Saturday, May 31, 2014

Ford Sync AppLink Android Demo - Playing Online Audio

*** I have this TDK available for sale. Anyone interested, shoot me an email or a comment to discuss ***

    Back in March I won a Ford Sync Test Development Kit (TDK), a device for testing mobile integration with Ford vehicles. As of January 2014 there were 1.5 million vehicles in North America that support AppLink, and the AppLink functionality will be released in Europe and Asia this year. 

    For this tutorial I will go over creating an app that plays an online audio stream (Denver's Comedy 103.1 icecast stream) from a mobile device over the Ford Sync system. The app supports audio playback, console and steering wheel button interactions, displaying text on the console and opening with a voice command. Below is a video I recorded today showing the app in use. Notice that the app itself is never brought to the foreground on the phone, as the only requirement is that the phone be on and paired with the system. All code for this demo is on GitHub.


    The first thing that we need to do is grab the AppLinkSDKAndroid-2-2.jar file from http://developer.ford.com and place it into the project's lib folder. We then need to open the manifest to register the required permissions, a receiver for the Sync Bluetooth events, and the service that controls the interactions between the Android device and the Ford Sync console. 

Permissions:
<uses-permission android:name="android.permission.BLUETOOTH" />
<uses-permission android:name="android.permission.INTERNET" />
<uses-permission android:name="android.permission.ACCESS_NETWORK_STATE" />
<uses-permission android:name="android.permission.RECEIVE_BOOT_COMPLETED" />

Service and Receiver:
<service android:name=".Service.AppLinkService"></service>

<receiver android:name=".Receiver.AppLinkReceiver">
    <intent-filter>
        <action android:name="android.bluetooth.adapter.action.STATE_CHANGED" />
        <action android:name="android.bluetooth.device.action.ACL_CONNECTED" />
        <action android:name="android.bluetooth.device.action.ACL_DISCONNECTED"/>
        <action android:name="android.intent.action.BOOT_COMPLETED" />
        <action android:name="android.media.AUDIO_BECOMING_NOISY" />
    </intent-filter>
</receiver>

    It should be noted that the receiver listens for five different actions covering Bluetooth state changes, device connection and disconnection, boot, and audio output becoming noisy. These actions help us determine when the audio service should be started or stopped.

    Once the manifest has been filled in, we can move over to our MainActivity file. In onCreate, we just need to call a method that starts the AppLink service and proxy, which listens for events from the Ford Sync console. This covers the case where the app is already open when the user enters their vehicle, whereas our receiver handles starting the service when the app is not open.

private void startSyncProxyService() {
    boolean isPaired = false;
    BluetoothAdapter btAdapter = BluetoothAdapter.getDefaultAdapter();

    if( btAdapter != null ) {
        if( btAdapter.isEnabled() && btAdapter.getBondedDevices() != null && !btAdapter.getBondedDevices().isEmpty() ) {
            for( BluetoothDevice device : btAdapter.getBondedDevices() ) {
                if( device.getName() != null && device.getName().contains( getString( R.string.device_name ) ) ) {
                    isPaired = true;
                    break;
                }
            }
        }

        if( isPaired ) {
            if( AppLinkService.getInstance() == null ) {
                Intent appLinkServiceIntent = new Intent( this, AppLinkService.class );
                startService( appLinkServiceIntent );
            } else {
                SyncProxyALM proxyInstance = AppLinkService.getInstance().getProxy();
                if( proxyInstance == null ) {
                    AppLinkService.getInstance().startProxy();
                }
            }
        }
    }
}

    The inverse of startSyncProxyService in onCreate is endSyncProxyInstance, which is called when MainActivity is destroyed.

private void endSyncProxyInstance() {
    if( AppLinkService.getInstance() != null ) {
        SyncProxyALM proxy = AppLinkService.getInstance().getProxy();
        if( proxy != null ) {
            AppLinkService.getInstance().reset();
        } else {
            AppLinkService.getInstance().startProxy();
        }
    }
}

    Next we need to fill in the AppLinkReceiver class, which extends BroadcastReceiver. The only method we need from the receiver is onReceive, which takes an intent and checks the actions associated with that intent in order to determine if the AppLinkService should be started or stopped.

@Override
public void onReceive(Context context, Intent intent) {
    if( intent == null || intent.getAction() == null || context == null )
        return;

    BluetoothDevice device = intent.getParcelableExtra( BluetoothDevice.EXTRA_DEVICE );
    String action = intent.getAction();

    Intent serviceIntent = new Intent( context, AppLinkService.class );
    serviceIntent.putExtras( intent );


    //Should start service
    if( action.compareTo(BluetoothDevice.ACTION_ACL_CONNECTED) == 0 &&
            device != null &&
            device.getName() != null &&
            device.getName().contains( context.getString( R.string.device_name ) ) &&
            AppLinkService.getInstance() == null )
    {
        context.startService(serviceIntent);
    }

    else if( action.equals( Intent.ACTION_BOOT_COMPLETED ) &&
            BluetoothAdapter.getDefaultAdapter() != null &&
            BluetoothAdapter.getDefaultAdapter().isEnabled() ) {
        context.startService(serviceIntent);

    }

    //Should stop service
    else if( action.equals( BluetoothDevice.ACTION_ACL_DISCONNECTED ) &&
            device != null &&
            device.getName() != null &&
            device.getName().contains( context.getString( R.string.device_name ) ) &&
            AppLinkService.getInstance() != null )
    {
        context.stopService( serviceIntent );
    }

    else if( action.equals(BluetoothAdapter.ACTION_STATE_CHANGED ) &&
        intent.getIntExtra(BluetoothAdapter.EXTRA_STATE, -1) == BluetoothAdapter.STATE_TURNING_OFF &&
        AppLinkService.getInstance() != null )
    {
        context.stopService( serviceIntent );
    }

    else if( action.equals( AudioManager.ACTION_AUDIO_BECOMING_NOISY ) ) {
        context.stopService( serviceIntent );
    }
}

    Now that we have the infrastructure for starting and stopping our AppLinkService, it's time to build the service file. We start off by extending Service and implementing Ford's IProxyListenerALM interface, then stubbing out all of the required methods associated with it. At the top of the class we should define our class variables:

private static AppLinkService mInstance = null;
private MediaPlayer mPlayer = null;
private SyncProxyALM mProxy = null;
private int mCorrelationId = 0;

    Because of the way Ford handles the AppLinkService and the requirement that there only be one AppLinkService and SyncProxyALM to control all interactions between an app and the console, we must keep an instance of our service and proxy. We also need to keep track of an integer value (mCorrelationId) to differentiate messages between the console and app. When the service is started in onStartCommand, we need to check that bluetooth is available and enabled, and then start the proxy.

@Override
public int onStartCommand(Intent intent, int flags, int startId) {
    if( intent != null &&
        BluetoothAdapter.getDefaultAdapter() != null &&
        BluetoothAdapter.getDefaultAdapter().isEnabled() ) {
        startProxy();
    }

    return START_STICKY;
}
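
    To see why mCorrelationId is a simple incrementing integer, here is a minimal plain-Java sketch (a hypothetical illustration, not AppLink SDK code; the class and method names are made up) of how unique IDs let a single listener pair responses with the requests that caused them:

```java
import java.util.HashMap;
import java.util.Map;

// Hypothetical illustration, not AppLink SDK code: each message sent to the
// console is tagged with a unique, incrementing id so that a response
// arriving later can be matched back to the request that caused it.
public class CorrelationDemo {
    private int mCorrelationId = 0;
    private final Map<Integer, String> mPending = new HashMap<Integer, String>();

    // Tag an outgoing request with the next correlation id and remember it.
    public int send( String request ) {
        int id = mCorrelationId++;
        mPending.put( id, request );
        return id;
    }

    // A response carries the id back; look up which request it answers.
    public String onResponse( int id ) {
        return mPending.remove( id );
    }
}
```

    Every proxy call in the service that passes mCorrelationId++ (show, subscribeButton, and so on) follows this pattern, which is why the field only ever needs to be incremented.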

    The startProxy method simply creates a SyncProxyALM object with a title and an ID for identifying the app. The title used here is also the phrase for starting the app with a voice command from the console.

public void startProxy() {
    if( mProxy == null ) {
        try {
            mProxy = new SyncProxyALM( this, getString( R.string.display_title ), true, getString( R.string.app_link_id ) );
        } catch( SyncException e ) {
            if( mProxy == null ) {
                stopSelf();
            }
        }
    }
}

    When the app connects to Sync and starts on the console, the onOnHMIStatus method is called with an OnHMIStatus object passed in. This is where we determine whether audio should be played or stopped, display initial text on the screen, and subscribe to buttons.

@Override
public void onOnHMIStatus(OnHMIStatus onHMIStatus) {
    switch( onHMIStatus.getSystemContext() ) {
        case SYSCTXT_MAIN:
        case SYSCTXT_VRSESSION:
        case SYSCTXT_MENU:
            break;
        default:
            return;
    }

    switch( onHMIStatus.getAudioStreamingState() ) {
        case AUDIBLE: {
            playAudio();
            break;
        }
        case NOT_AUDIBLE: {
            stopAudio();
            break;
        }
    }

    if( mProxy == null )
        return;

    if( onHMIStatus.getHmiLevel().equals( HMILevel.HMI_FULL ) && onHMIStatus.getFirstRun() ) {
        //setup app with SYNC
        try {
            mProxy.show( "Welcome to Paul's", "Ford AppLink Demo", TextAlignment.CENTERED, mCorrelationId++ );
        } catch( SyncException e ) {}
        subscribeToButtons();
    }
}

    subscribeToButtons simply subscribes to buttons through the control proxy; the service then listens for button presses through Sync's event listener and performs actions based on which button came through.

private void subscribeToButtons() {
    if( mProxy == null )
        return;

    try {
        mProxy.subscribeButton( ButtonName.OK, mCorrelationId++ );
    } catch( SyncException e ) {}
}

@Override
public void onOnButtonPress(OnButtonPress notification) {
    if( ButtonName.OK == notification.getButtonName() ) {
        if( mPlayer != null ) {
            if( mPlayer.isPlaying() ) {
                stopAudio();
            } else {
                playAudio();
            }
        }
    }
}

    The final step to implementing audio over AppLink is playing and stopping the audio, which are handled exactly as they would be in any other app: the MediaPlayer object is created and set with a URL, then started once the stream has loaded. The only difference here is that the proxy is used to display text on the user's console as the operations happen.

private void playAudio() {
    String url = "http://173.231.136.91:8060/";
    if( mPlayer == null )
        mPlayer = new MediaPlayer();

    try {
        mProxy.show("Loading...", "", TextAlignment.CENTERED, mCorrelationId++);
    } catch( SyncException e ) {}

    mPlayer.reset();
    mPlayer.setAudioStreamType( AudioManager.STREAM_MUSIC );
    try {
        mPlayer.setDataSource(url);
        mPlayer.setOnPreparedListener( new MediaPlayer.OnPreparedListener() {
            @Override
            public void onPrepared( MediaPlayer mediaPlayer ) {
                mediaPlayer.start();
                try {
                    mProxy.show("Playing online audio", "", TextAlignment.CENTERED, mCorrelationId++);
                } catch( SyncException e ) {}
            }
        });
        mPlayer.prepare();
    } catch (IllegalArgumentException e) {
    } catch (SecurityException e) {
    } catch (IllegalStateException e) {
    } catch (IOException e) {
    }
}

private void stopAudio() {
    if( mPlayer == null )
        return;
    mPlayer.pause();
    try {
        mProxy.show("Press OK", "to play audio", TextAlignment.CENTERED, mCorrelationId++);
    } catch( SyncException e ) {}
}

    And with that, we have a simple audio service integrated with Ford Sync AppLink. There's a lot more that can be done with the system, such as text to speech, using GPS, and integrating any information available from the phone via sensors or online data. As more vehicles end up with this system in the world, it'll be interesting to see what else people do with it.

Friday, May 30, 2014

Introduction to Geofencing with a Service

   ** LocationClient was deprecated in a release of Play Services later than this post. The geofencing code should still apply; just switch to using the newer 7.5 version of the location APIs **

   Geofencing is a feature in Google Play Services that allows for checking when a user has entered or left a circular area. This feature is implemented using Google's Location Services, so it relies on location data from cellular towers, wireless networks and GPS. While this can be a powerful tool, it should be used conservatively, as continuously polling for a user's location can be taxing on the device battery. It should also be noted that geofencing can take some time to register that an area has been entered or left, so you should plan accordingly when designing any apps that use this feature. For this post I have decided to put together an app that creates a geofence around the user's starting location and posts a notification when the user enters or leaves the fence. All code for this demo can be found on GitHub.
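
    Under the hood, a circular geofence amounts to a distance test between the device's reported location and the fence center. This plain-Java sketch (hypothetical names, not Play Services code) shows the idea using the haversine formula:

```java
// Hypothetical sketch, not Play Services code: a circular geofence is
// conceptually a great-circle distance check against the fence center.
public class GeofenceCheck {
    private static final double EARTH_RADIUS_M = 6371000.0;

    // Great-circle distance between two lat/lng points in meters (haversine).
    public static double distanceMeters( double lat1, double lng1, double lat2, double lng2 ) {
        double dLat = Math.toRadians( lat2 - lat1 );
        double dLng = Math.toRadians( lng2 - lng1 );
        double a = Math.sin( dLat / 2 ) * Math.sin( dLat / 2 )
                 + Math.cos( Math.toRadians( lat1 ) ) * Math.cos( Math.toRadians( lat2 ) )
                 * Math.sin( dLng / 2 ) * Math.sin( dLng / 2 );
        return 2 * EARTH_RADIUS_M * Math.asin( Math.sqrt( a ) );
    }

    // True when the point falls inside a fence of the given radius (meters).
    public static boolean insideFence( double lat, double lng,
                                       double centerLat, double centerLng, double radiusM ) {
        return distanceMeters( lat, lng, centerLat, centerLng ) <= radiusM;
    }
}
```

    Play Services layers the hard parts on top of this check: blending location sources, managing battery, and firing your PendingIntent even while the app is in the background.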

    The first thing that should be done is to update the build.gradle file to include Google Play Services.

compile 'com.google.android.gms:play-services:4.+'

    Next the manifest should be edited to request the ACCESS_FINE_LOCATION permission in order to use the device's GPS functionality. Meta-data for the Google Play Services version should be included as a tag, and the service that will be used for handling what action to take on a geofence event should be declared. For this demo I also force the app into portrait mode in order to avoid the boilerplate code required for reconnecting to Play Services on rotate.

<?xml version="1.0" encoding="utf-8"?>
<manifest xmlns:android="http://schemas.android.com/apk/res/android"
    package="com.ptrprograms.geofencing" >

    <uses-permission android:name="android.permission.ACCESS_FINE_LOCATION" />

    <application
        android:allowBackup="true"
        android:icon="@drawable/ic_launcher"
        android:label="@string/app_name"
        android:theme="@style/AppTheme" >
        <activity
            android:name=".Activity.MainActivity"
            android:label="@string/app_name"
            android:screenOrientation="portrait">
            <intent-filter>
                <action android:name="android.intent.action.MAIN" />

                <category android:name="android.intent.category.LAUNCHER" />
            </intent-filter>
        </activity>

        <service android:name=".Service.GeofencingService" />

        <meta-data
            android:name="com.google.android.gms.version"
            android:value="@integer/google_play_services_version" />

    </application>

</manifest>


    Once the manifest file is set, the main activity for the application should be set to implement the Google Play Services connection callbacks and the LocationClient geofencing result and removal listeners. These are generally used for taking specific actions when Play Services has connected or disconnected, as well as handling geofencing events.

public class MainActivity extends Activity implements 
        GooglePlayServicesClient.ConnectionCallbacks,
        GooglePlayServicesClient.OnConnectionFailedListener,
        LocationClient.OnAddGeofencesResultListener,
        LocationClient.OnRemoveGeofencesResultListener {

     For now we'll leave the required methods as stubs and get back to them once everything else is ready. The first thing we should do in our activity, after calling setContentView, is verify that the device has Google Play Services available. If not, we can take action to alert the user to download it from the play store, or in this case simply end the activity.

private void verifyPlayServices() {
    switch ( GooglePlayServicesUtil.isGooglePlayServicesAvailable( this ) ) {
        case ConnectionResult.SUCCESS: {
            break;
        }
        case ConnectionResult.SERVICE_VERSION_UPDATE_REQUIRED: {
            finish();
            break;
        }
        default: {
            finish();
        }
    }
}

    Once Play Services have been confirmed, we can create a reference to the device LocationClient, and create a PendingIntent to start our service (GeofencingService.class) that handles performing an action when the user interacts with a geofence.

mLocationClient = new LocationClient( this, this, this );
mIntent = new Intent( this, GeofencingService.class );
mPendingIntent = PendingIntent.getService( this, 0, mIntent, PendingIntent.FLAG_UPDATE_CURRENT );

    After this point you can start the geofence operations whenever appropriate for your application; in this case they are handled by a simple ToggleButton. To start listening for geofence events, you use the Geofence.Builder class to create a fence at a given lat/lng pair with a radius (in meters), specify whether the fence should listen for entering, exiting or both transitions, and set an expiration time. For this example I use the user's current location, a radius of 100 meters, both entering and exiting events, and no expiration, since I am controlling the fences myself. Once the geofence is built, it is placed in an ArrayList of Geofence objects and added to the location client along with our PendingIntent, so that when the fence is triggered, the PendingIntent is fired off.

private void startGeofence() {
    Location location = mLocationClient.getLastLocation();

    Geofence.Builder builder = new Geofence.Builder();
    mGeofence = builder.setRequestId( FENCE_ID )
            .setCircularRegion( location.getLatitude(), location.getLongitude(), RADIUS )
            .setTransitionTypes( Geofence.GEOFENCE_TRANSITION_ENTER | Geofence.GEOFENCE_TRANSITION_EXIT )
            .setExpirationDuration( Geofence.NEVER_EXPIRE )
            .build();

    ArrayList<Geofence> geofences = new ArrayList<Geofence>();
    geofences.add( mGeofence );
    mLocationClient.addGeofences( geofences, mPendingIntent, this );
}

    At this point we need to populate the LocationClient.OnAddGeofencesResultListener onAddGeofencesResult method. This method only has one responsibility: initiate the IntentService that will handle geofence events.

public void onAddGeofencesResult(int status, String[] geofenceIds ) {
    if( status == LocationStatusCodes.SUCCESS ) {
        Intent intent = new Intent( mIntent );
        startService( intent );
    }
}

    Once the service is created and the PendingIntent is associated with the LocationClient, our service will receive an intent every time the geofence is entered or exited, even if the app itself is in the background. The only thing we do with the service in this case is post a notification informing the user that they have entered or exited our specified area, determined by checking LocationClient.getGeofenceTransition( intent ). An introduction to notifications can be found in one of my earlier posts.

Entering the geofenced area
@Override
protected void onHandleIntent( Intent intent ) {
    NotificationCompat.Builder builder = new NotificationCompat.Builder( this );
    builder.setSmallIcon( R.drawable.ic_launcher );
    builder.setDefaults( Notification.DEFAULT_ALL );
    builder.setOngoing( true );

    int transitionType = LocationClient.getGeofenceTransition( intent );
    if( transitionType == Geofence.GEOFENCE_TRANSITION_ENTER ) {
        builder.setContentTitle( "Geofence Transition" );
        builder.setContentText( "Entering Geofence" );
        mNotificationManager.notify( 1, builder.build() );
    }
    else if( transitionType == Geofence.GEOFENCE_TRANSITION_EXIT ) {
        builder.setContentTitle( "Geofence Transition" );
        builder.setContentText( "Exiting Geofence" );
        mNotificationManager.notify( 1, builder.build() );
    }
}

Exiting the geofenced area 
    In order to remove the fences, we simply call removeGeofences with the PendingIntent and context passed in.

private void stopGeofence() {
    mLocationClient.removeGeofences( mPendingIntent, this );
}

    One important thing to notice here is that while the fence is removed, we also need to kill the background service that is waiting for geofence events. This is where the LocationClient.OnRemoveGeofencesResultListener interface comes into play. Since we are removing the geofences by passing in our PendingIntent, we need to populate the onRemoveGeofencesByPendingIntentResult method.

@Override
public void onRemoveGeofencesByPendingIntentResult(int status, PendingIntent pendingIntent) {
    if( status == LocationStatusCodes.SUCCESS ) {
        stopService( mIntent );
    }
}

    And with that, we have everything we need to create and use geofences within an Android application. There's a lot that can be done with this feature, such as adding a "warmer/colder" mechanic to a scavenger hunt, keeping track of when a user enters a store, or monitoring when a user is near a landmark, to name a few. I hope this tutorial helps others create something awesome, and good luck!

Monday, May 26, 2014

Making a Flashlight

    I'm pretty sure you can't call yourself an Android developer without making at least one flashlight app, right? Let's just go ahead and get that out of the way then. The source code is available on GitHub.

    The first thing that needs to be done is to declare permissions in the manifest

<uses-permission android:name="android.permission.CAMERA" />
<uses-feature android:name="android.hardware.camera" />

    I also modify the activity to handle configuration changes so that the activity doesn't destroy itself on rotate. This prevents the flashlight from turning off and then back on when the device orientation is changed.

<activity
    android:name="com.ptrprograms.flashlight.MainActivity"
    android:label="@string/app_name"
    android:configChanges="orientation|keyboardHidden|screenSize">
    <intent-filter>
        <action android:name="android.intent.action.MAIN" />

        <category android:name="android.intent.category.LAUNCHER" />
    </intent-filter>
</activity>

    The layout for this demo is a simple toggle button located in the center of the screen. That button is initialized in MainActivity, and the click listener verifies that the device has the camera flash feature.

private void initFlashlightButton() {
    ToggleButton button = (ToggleButton) findViewById( R.id.button_flashlight );
    button.setOnClickListener( new View.OnClickListener() {
        @Override
        public void onClick(View view) {
            if( getPackageManager().hasSystemFeature( PackageManager.FEATURE_CAMERA_FLASH ) )
            {
                if( mFlashlightOn )
                    deactivateFlashlight();
                else
                    activateFlashlight();
            }
        }
    });
}

    The activateFlashlight method gets the parameter information from the device camera and adds a parameter to turn on the flash in torch mode, meaning the flash stays on until deactivated, rather than 'flashing.'

private void activateFlashlight()
{
    if( mCamera == null )
        mCamera = Camera.open();

    mParameters = mCamera.getParameters();
    mParameters.setFlashMode( Camera.Parameters.FLASH_MODE_TORCH );
    mCamera.setParameters( mParameters );
    mFlashlightOn = true;
}

    The deactivateFlashlight method does the same thing, but in reverse. Instead of setting the flash mode to FLASH_MODE_TORCH, it uses Camera.Parameters.FLASH_MODE_OFF.

private void deactivateFlashlight()
{
    if( mCamera == null || mParameters == null )
        return;

    mParameters = mCamera.getParameters();
    mParameters.setFlashMode( Camera.Parameters.FLASH_MODE_OFF );
    mCamera.setParameters( mParameters );
    mFlashlightOn = false;
}

    The final part to building a simple flashlight app is releasing the camera resource on destroy. This turns the light off when the app is exited.

@Override
protected void onDestroy() {
    super.onDestroy();
    if( mCamera == null )
        return;

    mCamera.release();
}

    And that's that, a simple flashlight app to add to the toolbox.

Sunday, May 25, 2014

Using the Navigation Drawer with Otto

    Navigation is one of the most important things to consider when planning the architecture of an app. One pattern that has seen a good deal of success is the Navigation Drawer, officially released in the support library during the summer of 2013. The drawer is a swipeable panel that can be populated with a view or a fragment, allowing a lot of freedom in designing the navigation for your app: a standard list, a spinner, image buttons, or anything else that can trigger events such as displaying a new fragment or activity.

Drawer with items for selecting a new fragment
    For this demo, I have put together a drawer with a list view that changes which fragment is displayed in the main activity on click. These navigation clicks fire events through Otto, a third party event bus library from Square, that lets the main activity know what fragment should be swapped in. All of the code for this working example can be found on GitHub.

    To start, you should declare the DrawerLayout widget in your activity layout file as the root element.

<android.support.v4.widget.DrawerLayout
    xmlns:android="http://schemas.android.com/apk/res/android"
    android:id="@+id/drawer_layout"
    android:layout_width="match_parent"
    android:layout_height="match_parent">

    <FrameLayout
        android:id="@+id/container"
        android:layout_width="match_parent"
        android:layout_height="match_parent" />

    <com.ptrprograms.navigationdrawer.views.DrawerListView
              android:id="@+id/drawer"
              android:layout_width="240dp"
              android:layout_height="match_parent"
              android:layout_gravity="start"
              android:choiceMode="singleChoice"
              android:divider="@android:color/black"
              android:dividerHeight="0dp"
              android:background="@android:color/background_light" />

</android.support.v4.widget.DrawerLayout>

    The first child element within the drawer layout will be the main layout for your activity, while the second element will be the item within the drawer. In this case I am using a custom ListView. The width should be set to a specific value, such as 240dp, so that it only expands to a certain point on the device screen. The gravity and height for the drawer item should also be set to start and match_parent, respectively, as the drawer by convention should be on the left side of the screen. The other xml properties I have included in my DrawerListView element are simply style specific.

    Now that the layout for the drawer activity is set up, we need to programmatically configure the drawer to work in our main activity. The first step is to enable the home button on the action bar with these two commands:

getSupportActionBar().setDisplayHomeAsUpEnabled(true);
getSupportActionBar().setHomeButtonEnabled(true);

    It should be noted that for this demo I am using the ActionBarCompat support classes, but getSupportActionBar() can be changed to getActionBar() for projects supporting Android 3.0+. Next we need to create the ActionBarDrawerToggle and listener for when the drawer opens and closes:

private void initDrawer() {
    mDrawerToggle = new ActionBarDrawerToggle( this, mDrawerLayout,
            R.drawable.ic_navigation_drawer, R.string.drawer_open_title, R.string.drawer_close_title ) {

        @Override
        public void onDrawerClosed(View drawerView) {
            super.onDrawerClosed(drawerView);
            if( getSupportActionBar() == null )
                return;

            getSupportActionBar().setTitle( R.string.drawer_close_title );
            invalidateOptionsMenu();
        }

        @Override
        public void onDrawerOpened(View drawerView) {
            super.onDrawerOpened(drawerView);
            if( getSupportActionBar() == null )
                return;

            getSupportActionBar().setTitle( R.string.drawer_open_title );
            invalidateOptionsMenu();
        }
    };

    mDrawerLayout.setDrawerListener(mDrawerToggle);

}


    The toggle is created with a context, the DrawerLayout object, a drawable for the top-left action bar icon (generally a hamburger icon, which can be generated through Android Asset Studio), and a pair of action bar titles to display when the drawer is open or closed. ActionBarDrawerToggle also acts as a listener that supports onDrawerClosed and onDrawerOpened. This is where you can set the action bar titles and hide or show any menu items in the action bar.

Application with closed drawer title and extended hamburger icon (dark purple, top left)
    Two more methods must be added to the activity in order to ensure that the drawer stays open or closed on configuration changes, such as rotating the device:

@Override
protected void onPostCreate(Bundle savedInstanceState) {
    super.onPostCreate(savedInstanceState);
    mDrawerToggle.syncState();
}

@Override
public void onConfigurationChanged(Configuration newConfig) {
    super.onConfigurationChanged(newConfig);
    mDrawerToggle.onConfigurationChanged(newConfig);
}

    The final part of our main activity is a method that uses Otto to subscribe to particular events on the event bus. The method is annotated with @Subscribe to listen for the DrawerNavigationItemClickedEvent. The event has a String field called section that is used to select which fragment to display next. Once the fragment is selected and displayed, the drawer is programmatically closed.

@Subscribe
public void onDrawerNavigationClickedEvent( DrawerNavigationItemClickedEvent event ) {
    if( !mCurFragmentTitle.equalsIgnoreCase(event.section) ) {
        if (getString(R.string.fragment_image).equalsIgnoreCase(event.section)) {
            getSupportFragmentManager().beginTransaction().replace(R.id.container, ImageFragment.getInstance()).commit();
        } else if (getString(R.string.fragment_text).equalsIgnoreCase(event.section)) {
            getSupportFragmentManager().beginTransaction().replace(R.id.container, TextFragment.getInstance()).commit();
        } else if (getString(R.string.fragment_number_list).equalsIgnoreCase(event.section)) {
            getSupportFragmentManager().beginTransaction().replace(R.id.container, NumberListFragment.getInstance()).commit();
        }
        mCurFragmentTitle = event.section;
    }
    mDrawerLayout.closeDrawers();
}

    Now that the main activity is set up, let's go over Otto and how it's used to control navigation through the drawer. In the main activity's onCreate method we receive an instance of our navigation bus, which is created using a singleton pattern, and register it in onStart. This is the NavigationBus singleton class:

public class NavigationBus extends Bus {
    private static final NavigationBus navigationBus = new NavigationBus();

    public static NavigationBus getInstance() {
        return navigationBus;
    }

    private NavigationBus() {

    }

}

    In the custom ListView for the drawer, an OnItemClickListener is implemented, and the title of the clicked item is posted in an event over the NavigationBus.

@Override
public void onItemClick(AdapterView<?> adapterView, View view, int position, long l) {
    String drawerText = ( (DrawerItem) adapterView.getAdapter().getItem( position ) ).getDrawerText();
    NavigationBus.getInstance().post( new DrawerNavigationItemClickedEvent( drawerText ) );
}

    The DrawerNavigationItemClickedEvent is the same one that is listened for in our main activity, and that event is declared like so:

public class DrawerNavigationItemClickedEvent {

    public String section;

    public DrawerNavigationItemClickedEvent( String section ) {
        this.section = section;
    }

}

    As can be seen here, Otto allows an application to fire events from any component and catch them in any other component that is listening for that specific event type. This makes things such as navigation and passing data from dialogs incredibly simple, saving development time and headaches.

    And with that, we now have a working navigation drawer that allows for changing fragments in our application and providing a proper drawer user experience. If there are any questions, please feel free to comment, otherwise I hope this tutorial is helpful for other developers out there.

Saturday, May 24, 2014

Creating a Native Video Player Activity

    One of the major uses for mobile devices is to watch videos. While YouTube is great for this, not all content is posted there. Given that sometimes proprietary online content will need to be played, I decided to make a simple native video player to play content from a URL. All code for this demo is available on GitHub.

Video player with an online video in landscape orientation
    The MainActivity for this application is a simple button that, when pressed, launches an intent with a URL for the native video player activity.

private void launchVideoPlayer() {
    Intent i = new Intent( this, VideoPlayerActivity.class );
    i.putExtra( VideoPlayerActivity.EXTRA_VIDEO_URL, "http://www.pocketjourney.com/downloads/pj/video/famous.3gp" );
    startActivity( i );
}

    The layout for the video player activity consists of a VideoView and an indeterminate ProgressBar spinner. The spinner is shown while the video is buffered.

Loading spinner for the video player
    When the video player is created, MediaPlayer listeners and the URL that was passed through the intent are attached to the VideoView. 

mVideoView = (VideoView) findViewById( R.id.video_view );
if( mVideoView == null ) {
    throw new IllegalArgumentException( "Layout must contain a video view with ID video_view" );
}

mVideoView.setOnCompletionListener( onCompletionListener );
mVideoView.setOnErrorListener( onErrorListener );
mVideoView.setOnPreparedListener( onPreparedListener );

mUri = Uri.parse( getIntent().getExtras().getString( EXTRA_VIDEO_URL ) );
mVideoView.setVideoURI( mUri );

    A MediaController object is then created to add playback controls for the VideoView.

mMediaController = new MediaController( this );
mMediaController.setEnabled( true );
mMediaController.show();
mMediaController.setMediaPlayer( mVideoView );

    The OnCompletionListener resets the video to its initial position, pauses it and shows the playback controls.

protected MediaPlayer.OnCompletionListener onCompletionListener = new MediaPlayer.OnCompletionListener() {
    @Override
    public void onCompletion(MediaPlayer mediaPlayer) {
        mVideoView.seekTo( 0 );
        if( mVideoView.isPlaying() )
            mVideoView.pause();

        if( !mMediaController.isShowing() )
            mMediaController.show();
    }
};

    The OnPreparedListener is triggered when the video URL has been resolved and buffered enough to begin playback. This is where the progress spinner is hidden, the media controller is attached to the VideoView, and the video is started.

protected MediaPlayer.OnPreparedListener onPreparedListener = new MediaPlayer.OnPreparedListener() {
    @Override
    public void onPrepared(MediaPlayer mediaPlayer) {
        if( mediaPlayer == null )
            return;
        mMediaPlayer = mediaPlayer;
        mediaPlayer.start();
        if( mSpinningProgressBar != null )
            mSpinningProgressBar.setVisibility( View.GONE );

        mVideoView.setMediaController( mMediaController );
    }
};

    Finally, the OnErrorListener simply shows an AlertDialog that closes the video activity when the dialog is dismissed. Once the basic infrastructure for playing the remote video is in place, handling rotation and leaving/coming back to the app should be considered. For hitting home and coming back to the app, the onPause and onResume methods are used to save the video position, restore it and make the spinner visible again. Since the OnPreparedListener is still attached to the VideoView, it will be triggered when the video is ready to be played again and the app can continue as normal.

@Override
protected void onResume() {
    super.onResume();
    mVideoView.seekTo( mPosition );
    mSpinningProgressBar.setVisibility( View.VISIBLE );
}

@Override
protected void onPause() {
    super.onPause();
    mPosition = mVideoView.getCurrentPosition();
}
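The OnErrorListener itself isn't shown in this post (the full version is in the GitHub source); a minimal sketch of the behavior described above might look like this, with the dialog text being an assumption:

```java
protected MediaPlayer.OnErrorListener onErrorListener = new MediaPlayer.OnErrorListener() {
    @Override
    public boolean onError( MediaPlayer mediaPlayer, int what, int extra ) {
        new AlertDialog.Builder( VideoPlayerActivity.this )
                .setMessage( "Unable to play this video." )
                .setCancelable( false )
                .setPositiveButton( android.R.string.ok, new DialogInterface.OnClickListener() {
                    @Override
                    public void onClick( DialogInterface dialog, int which ) {
                        // Close the player activity when the dialog is dismissed
                        finish();
                    }
                } )
                .show();
        // Returning true tells the framework the error was handled, so the
        // default MediaPlayer error dialog is not shown as well.
        return true;
    }
};
```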

    For device rotation, we take advantage of the activity lifecycle and onSaveInstanceState and onRestoreInstanceState to save whether the video is playing and its position on rotation.

@Override
protected void onSaveInstanceState(Bundle outState) {
    super.onSaveInstanceState(outState);

    if( mVideoView == null || mVideoView.getCurrentPosition() == 0 )
        return;

    outState.putInt( EXTRA_CURRENT_POSITION, mVideoView.getCurrentPosition() );
    outState.putBoolean( EXTRA_IS_PLAYING, mVideoView.isPlaying() );
    mVideoView.pause();
}

@Override
protected void onRestoreInstanceState(Bundle savedInstanceState) {
    super.onRestoreInstanceState(savedInstanceState);
    if( mVideoView == null || savedInstanceState == null )
        return;

    if( savedInstanceState.getBoolean( EXTRA_IS_PLAYING, false ) ) {
        mVideoView.seekTo(savedInstanceState.getInt(EXTRA_CURRENT_POSITION, 0));
        mVideoView.start();
    }
}

Video in portrait orientation
    And with that, we have a simple native video player for remote content that uses native controls and offers some versatility. There's still a whole lot more that can be done with the video player, such as customizing the UI controls, that I haven't touched on, but those touches can add a good deal of polish to any media app.

Monday, May 19, 2014

Notifications Part 3: Android Wear Developer Preview

To wrap up the current set of posts about Android notifications, I will go over the Developer Preview for Wear. Wear is a standard Android-based OS for wearable devices, namely wristwatches at this early stage, that will hopefully be expanded to other devices in the near future. More information on what Wear is can be found on Google's site.

That's cool and all, but what are the capabilities of Wear? So far the developer preview only allows for notifications to be sent to the watch emulator, and for some predefined interactions. Google has shown some currently unavailable features such as speech-to-text, and Wear is able to send messages back to an Android device through the use of lock screen remote view buttons (a subject deserving of its own post that I hope to get to) and intents. Given that a wristwatch device can trigger events on a phone, the possibilities become endless when matched up with cloud services or additional external hardware, such as Android Open Accessories.

Remote View Notification Button for controlling an audio service (awesome, right?)
The first thing that needs to be done in order to start using the Wear preview is to sign up here. Once that's all set, you'll receive an email within a day or two letting you know if you're in the beta, and where to download the Android Wear Preview app. Once the app is installed, go into the Android SDK manager and make sure you're using the latest Support Library, then create an Android Wear emulator using API 19+. You'll also need to download the wearable support library jar from Google (though it's also in the libs folder in the source code for this post). When that's done, connect your Android device to your computer through USB and in a terminal (CMD prompt for you Windows folks) type the following from your SDK platform-tools folder:

adb -d forward tcp:5601 tcp:5601

Assuming you've opened the preview app and allowed Wear to receive notifications, we're ready to move on to the fun part of this post! As with my other posts, all of the source that I'll be talking about is available on GitHub.

All Wear notifications are essentially a normal notification builder wrapped by WearableNotifications.Builder, which can then have additional features added to it. It should be noted that the original notification builder must be the Android support library version: NotificationCompat.Builder. Here is the code for wrapping the standard Builder with the WearableNotifications.Builder:

        NotificationCompat.Builder notificationBuilder = new NotificationCompat.Builder( getActivity() )
                .setSmallIcon( R.drawable.ic_launcher )
                .setLargeIcon( BitmapFactory.decodeResource( getResources(), R.drawable.batman_punching_shark ) )
                .setContentText( getString( R.string.big_content_summary ) )
                .setContentTitle( getString( R.string.notification_basic ) );

        Notification notification =
                new WearableNotifications.Builder( notificationBuilder )
                        .setHintHideIcon(true)
                        .build();

        mNotificationManager.notify( notificationId, notification );

Basic Android Wear Notification
Now that we're able to show notifications on Wear, let's move on to something a bit more useful: sending intents from Wear actions. This uses the same method as a standard Android notification for sending an intent: addAction. By calling addAction with a PendingIntent on the NotificationCompat.Builder, an additional screen is added to the Wear notification that sends the intent when clicked. In this example, the intent is a standard ACTION_VIEW that opens a browser to my blog.

        Intent intent = new Intent( Intent.ACTION_VIEW );
        intent.setData( Uri.parse( "http://ptrprograms.blogspot.com" ) );
        PendingIntent pendingIntent = PendingIntent.getActivity( getActivity(), 0, intent, 0 );

        NotificationCompat.Builder notificationBuilder = new NotificationCompat.Builder( getActivity() )
                .setSmallIcon( R.drawable.ic_launcher )
                .setLargeIcon( BitmapFactory.decodeResource( getResources(), R.drawable.batman_punching_shark ) )
                .setContentText( getString( R.string.big_content_summary ) )
                .setContentTitle( getString( R.string.notification_add_action ) )
                .addAction( R.drawable.ic_launcher, "Launch Blog", pendingIntent );

        Notification notification =
                new WearableNotifications.Builder( notificationBuilder )
                        .setHintHideIcon( true )
                        .build();

        mNotificationManager.notify( notificationId, notification );

Notification Action Item
The next kind of notification is called Quick Reply. By creating a RemoteInput object with a String array, and then calling addRemoteInputForContentIntent when building the WearableNotification, you can provide a list of up to five items that allow the user to easily respond to a notification.

        Intent intent = new Intent( getActivity(), MainActivity.class );
        PendingIntent pendingIntent = PendingIntent.getActivity( getActivity(), 0, intent, 0 );

        NotificationCompat.Builder notificationBuilder = new NotificationCompat.Builder( getActivity() )
                .setSmallIcon( R.drawable.ic_launcher )
                .setLargeIcon( BitmapFactory.decodeResource( getResources(), R.drawable.batman_punching_shark ) )
                .setContentText( getString( R.string.big_content_summary ) )
                .setContentTitle( getString( R.string.notification_quick_replies ) )
                .setContentIntent( pendingIntent );

        String replyLabel = "Transportation";
        String[] replyChoices = getResources().getStringArray( R.array.getting_around );

        RemoteInput remoteInput = new RemoteInput.Builder( "extra_replies" )
                .setLabel(replyLabel)
                .setChoices(replyChoices)
                .build();

        Notification notification =
                new WearableNotifications.Builder( notificationBuilder )
                        .setHintHideIcon( true )
                        .addRemoteInputForContentIntent( remoteInput )
                        .build();

        mNotificationManager.notify( notificationId, notification );
Action for Quick Replies
Quick Replies
The next kind of Wear notification uses multiple pages to present information to the user. The first page notification is built like the other notifications by constructing the Notification builder, and then additional notification pages can be constructed and built with optional styles.

        NotificationCompat.BigTextStyle additionalPageStyle = new NotificationCompat.BigTextStyle();
        additionalPageStyle.setBigContentTitle( "Page 2" );

        Notification secondPageNotification =
                new NotificationCompat.Builder( getActivity() )
                        .setStyle( additionalPageStyle )
                        .build();

The additional notifications are then added to a List of Notification objects, and added to the WearableNotification during construction with the .addPages( list ) method.
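Putting that together, a minimal sketch of attaching the extra pages (using the secondPageNotification built above):

```java
// Collect the additional pages in order; each element becomes one extra
// swipeable page on the watch after the main notification.
List<Notification> extraPages = new ArrayList<Notification>();
extraPages.add( secondPageNotification );

Notification notification =
        new WearableNotifications.Builder( notificationBuilder )
                .addPages( extraPages )
                .build();

mNotificationManager.notify( notificationId, notification );
```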

Page 2 of 4

The final type of notification for Android Wear is the stackable notification. These notifications are created by building WearableNotifications, like the other examples, but with an additional .setGroup call that takes a standardized tag as the first parameter and, as the second parameter, a stack order value that must be different for each notification. At least one of the notifications should be a summary notification with an order of WearableNotifications.GROUP_ORDER_SUMMARY. Once all of these notifications are built, they can be posted using the NotificationManager.

Notification notification2 =
        new WearableNotifications.Builder( notificationBuilder )
                .setHintHideIcon( true )
                .setGroup( EXTRA_STACKED_GROUP, ++stackedId )
                .build();

Notification summaryNotification = new WearableNotifications.Builder( notificationBuilder )
        .setGroup( EXTRA_STACKED_GROUP, WearableNotifications.GROUP_ORDER_SUMMARY )
        .build();

Assuming Android Wear stays similar on release, it should be pretty straightforward and easy to integrate into any app, and additional features such as voice replies will make it even more useful.