
Recognize Landmark In Image Using Firebase ML Kit In Android Studio 2020 (Complete Guide) With Source Code | Step By Step Tutorial | 5 Simple Steps

In this post, we're going to implement landmark recognition in an app using Firebase ML Kit.


Step 1: Add Firebase to your Android project:

I recommend you first see how to add Firebase to an Android project in 5 minutes; if you have already added it, you can move on to Step 2.

Step 2: Add this dependency for the ML Kit Android libraries to your app-level build.gradle file:

implementation 'com.google.firebase:firebase-ml-vision:24.0.1'

as shown below:
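For reference, here is a rough sketch of how the dependencies block of the app-level build.gradle might look after adding the line (the other entries are just typical defaults and may differ in your project):

dependencies {
    implementation 'androidx.appcompat:appcompat:1.1.0'
    // Firebase ML Kit Vision library used for landmark recognition
    implementation 'com.google.firebase:firebase-ml-vision:24.0.1'
}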

Then click on Sync Now.


Step 3: Design the layout of the activity:

<?xml version="1.0" encoding="utf-8"?>
<RelativeLayout xmlns:android="http://schemas.android.com/apk/res/android"
    android:layout_width="match_parent"
    android:layout_height="match_parent">

    <ImageView
        android:id="@+id/image"
        android:layout_width="500dp"
        android:layout_height="500dp"
        android:layout_above="@+id/selectImage"
        android:layout_margin="30dp" />

    <Button
        android:id="@+id/selectImage"
        android:layout_width="wrap_content"
        android:layout_height="wrap_content"
        android:layout_centerInParent="true"
        android:text="Select Image !" />

    <TextView
        android:id="@+id/text"
        android:layout_width="wrap_content"
        android:layout_height="wrap_content"
        android:layout_below="@id/selectImage"
        android:layout_margin="30dp"
        android:textColor="@android:color/black"
        android:textSize="15sp" />

</RelativeLayout>



Step 4: Select an image from the device:

I recommend you first go through the post on how to select or capture an image from the device before going further.

So now, let's open the image cropping activity to select the image on button click:
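Here is a minimal sketch of the button's click listener, assuming the android-image-cropper library from the post linked above is already set up and selectImage is the button from our layout:

// import com.theartofdev.edmodo.cropper.CropImage;

Button selectImage = findViewById(R.id.selectImage);
selectImage.setOnClickListener(new View.OnClickListener() {
    @Override
    public void onClick(View v) {
        // Open the image cropping activity to pick (or capture) and crop an image
        CropImage.activity()
                .start(MainActivity.this);
    }
});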



and now get the image by overriding the onActivityResult method:
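A rough sketch, again assuming the android-image-cropper library; imageView here is assumed to be a field bound to the ImageView from our layout, and the returned Uri is passed to the landmark recognition method from Step 5:

@Override
protected void onActivityResult(int requestCode, int resultCode, @Nullable Intent data) {
    super.onActivityResult(requestCode, resultCode, data);
    if (requestCode == CropImage.CROP_IMAGE_ACTIVITY_REQUEST_CODE) {
        CropImage.ActivityResult result = CropImage.getActivityResult(data);
        if (resultCode == RESULT_OK) {
            // Display the cropped image and run landmark recognition on it
            Uri uri = result.getUri();
            imageView.setImageURI(uri);
            landmarkRecognitionFromImage(uri);
        } else if (resultCode == CropImage.CROP_IMAGE_ACTIVITY_RESULT_ERROR_CODE) {
            // Cropping failed; log the error
            result.getError().printStackTrace();
        }
    }
}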


Step 5: Get the information about the recognized landmark:

**You need to upgrade to the Blaze plan to use landmark recognition.**

So after upgrading, follow these steps:
1. Prepare the input image.
2. Configure and run the landmark detector.
3. Get information about the recognized landmark.

There are five ways of creating a FirebaseVisionImage object (to prepare the input image):
(i) From a Bitmap,
(ii) From a media.Image,
(iii) From a ByteBuffer,
(iv) From a ByteArray,
(v) From a file on the device

We're creating it from a file path (the last option); if you want to know how to create it from the other options, comment down below:

private void landmarkRecognitionFromImage(Uri uri) {

    try {
        // 1. Prepare the input image
        FirebaseVisionImage image = FirebaseVisionImage.fromFilePath(MainActivity.this, uri);

        // 2. Configure and run the landmark detector
        FirebaseVisionCloudDetectorOptions options =
                new FirebaseVisionCloudDetectorOptions.Builder()
                        .setModelType(FirebaseVisionCloudDetectorOptions.LATEST_MODEL)
                        .setMaxResults(15)
                        .build();

        // Default settings:
        // FirebaseVisionCloudLandmarkDetector detector = FirebaseVision.getInstance()
        //         .getVisionCloudLandmarkDetector();

        // To change the default settings:
        FirebaseVisionCloudLandmarkDetector detector = FirebaseVision.getInstance()
                .getVisionCloudLandmarkDetector(options);

        // Run the landmark detector
        detector.detectInImage(image)
                .addOnSuccessListener(new OnSuccessListener<List<FirebaseVisionCloudLandmark>>() {
                    @Override
                    public void onSuccess(List<FirebaseVisionCloudLandmark> firebaseVisionCloudLandmarks) {
                        for (FirebaseVisionCloudLandmark landmark : firebaseVisionCloudLandmarks) {
                            // 3. Get information about the recognized landmark
                            Rect bounds = landmark.getBoundingBox();
                            String landmarkName = landmark.getLandmark();
                            String entityId = landmark.getEntityId();
                            float confidence = landmark.getConfidence();
                            textView.append("bounds " + bounds + "\n"
                                    + "landmarkName " + landmarkName + "\n"
                                    + "entityId " + entityId + "\n"
                                    + "confidence " + confidence);

                            // Multiple locations are possible, e.g., the location of the depicted
                            // landmark and the location the picture was taken.
                            for (FirebaseVisionLatLng loc : landmark.getLocations()) {
                                double latitude = loc.getLatitude();
                                double longitude = loc.getLongitude();
                                textView.append("latitude " + latitude + "\n");
                                textView.append("longitude " + longitude + "\n");
                            }
                        }
                    }
                })
                .addOnFailureListener(new OnFailureListener() {
                    @Override
                    public void onFailure(@NonNull Exception e) {
                        // Task failed with an exception
                        // ...
                    }
                });
    } catch (IOException e) {
        e.printStackTrace();
    }
}

Now, run the app :)

If everything is done correctly, you will see the expected output.

See the Firebase documentation for the full reference.

You can see the full source code on GitHub.

If you face any problem or have any suggestions, please comment down below; we'd love to answer.

Comment down below which topic you'd like a guide on next, or drop a message on our social media handles.

 Happy coding and designing : )


