
Audio, Video, and Camera Features in Android Marshmallow

Android Marshmallow brings improved audio, video, and camera capabilities: improvements have been made to enable and better support new protocols, and even to change the behavior of some APIs, such as the camera service.

Audio features

Android 6.0 Marshmallow adds several enhancements to the audio features, which we will cover in the upcoming sections.

Support for the MIDI protocol

The android.media.midi package was added in Android 6.0 (API 23).

With the new MIDI APIs, you can now send and receive MIDI (short for Musical Instrument Digital Interface) events in a much simpler way than before.

The package was built to provide us with capabilities to do the following:

  • Connect and use a MIDI keyboard
  • Connect to other MIDI controllers
  • Use external MIDI synthesizers, external peripherals, lights, show control, and so on
  • Allow dynamic music generation from games or music-creation apps
  • Allow the creation and passing of MIDI messages between apps
  • Allow Android devices to act as multi-touch controllers when connected to a laptop

When dealing with MIDI, you must declare it in the manifest, as follows:

<uses-feature android:name="android.software.midi" android:required="true"/>

Pay attention to the required attribute; as with other features, setting it to true means your app will be visible in the Play Store only to devices that support the MIDI API.

You can also check for MIDI support at runtime and then set the required attribute to false:

PackageManager pkgMgr = context.getPackageManager();
if (pkgMgr.hasSystemFeature(PackageManager.FEATURE_MIDI)) {
  //we can use MIDI API here as we know the device supports the MIDI API.
}

MidiManager

The proper way to use the MIDI API is via the MidiManager class; obtain it from a Context and use it when required:

MidiManager midiMgr = (MidiManager)context.getSystemService(Context.MIDI_SERVICE);
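
As a minimal sketch, assuming the midiMgr instance obtained above, you could enumerate the attached MIDI devices and open the first one (the port index 0 is an illustrative assumption, not something mandated by the API):

//Sketch only: the classes used here come from android.media.midi and android.os.
MidiDeviceInfo[] infos = midiMgr.getDevices();
if (infos.length > 0) {
  midiMgr.openDevice(infos[0], new MidiManager.OnDeviceOpenedListener() {
    @Override
    public void onDeviceOpened(MidiDevice device) {
      if (device == null) {
        Log.e("Midi", "Could not open the MIDI device");
        return;
      }
      //Port index 0 is an assumption; inspect MidiDeviceInfo.getPorts() first.
      MidiInputPort inputPort = device.openInputPort(0);
    }
  }, new Handler(Looper.getMainLooper()));
}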

For more information, you can refer to:

https://developer.android.com/reference/android/media/midi/package-summary.html

Digital audio capture and playback

Two new classes have been added for digital audio capture and playback:

  • android.media.AudioRecord.Builder – digital audio capture
  • android.media.AudioTrack.Builder – digital audio playback

These builders help configure the properties of the audio source and sink, respectively, as sketched below.
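
For instance, a minimal sketch of configuring a capture source and a playback sink with the new builders might look as follows (the sample rate, channel masks, and buffer size are illustrative values, and the RECORD_AUDIO permission is assumed):

//Capture: 16-bit mono PCM at 44.1 kHz (illustrative values).
AudioRecord recorder = new AudioRecord.Builder()
  .setAudioSource(MediaRecorder.AudioSource.MIC)
  .setAudioFormat(new AudioFormat.Builder()
    .setEncoding(AudioFormat.ENCODING_PCM_16BIT)
    .setSampleRate(44100)
    .setChannelMask(AudioFormat.CHANNEL_IN_MONO)
    .build())
  .setBufferSizeInBytes(2 * AudioRecord.getMinBufferSize(44100,
    AudioFormat.CHANNEL_IN_MONO, AudioFormat.ENCODING_PCM_16BIT))
  .build();

//Playback: a matching sink built the same way.
AudioTrack track = new AudioTrack.Builder()
  .setAudioFormat(new AudioFormat.Builder()
    .setEncoding(AudioFormat.ENCODING_PCM_16BIT)
    .setSampleRate(44100)
    .setChannelMask(AudioFormat.CHANNEL_OUT_MONO)
    .build())
  .build();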

Audio and input devices

The new hasMicrophone() method has been added to the InputDevice class. It reports whether the device has a built-in microphone that developers can use. Let's say you want to enable voice search from a controller connected to Android TV and you get an onSearchRequested() callback for the user's search; you can then verify that there's a microphone using the InputDevice object you get from that callback, as sketched below.
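
A minimal sketch of that check, assuming your activity overrides onSearchRequested(SearchEvent) and a hypothetical startVoiceSearch() helper:

@Override
public boolean onSearchRequested(SearchEvent event) {
  InputDevice device = event.getInputDevice();
  if (device != null && device.hasMicrophone()) {
    //Safe to start a voice-driven search from this controller.
    startVoiceSearch(); //hypothetical helper
    return true;
  }
  return super.onSearchRequested(event);
}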

Information on audio devices

The new AudioManager.getDevices(int flags) method allows easy retrieval of all the audio devices currently connected to the system. If you want to be notified when there are audio device connections/disconnections, you can register your app to an AudioDeviceCallback callback via the AudioManager.registerAudioDeviceCallback(AudioDeviceCallback callback, Handler handler) method.
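
A hedged sketch of both calls (the logging is illustrative only; a null Handler delivers the callbacks on the main thread):

AudioManager audioMgr = (AudioManager) context.getSystemService(Context.AUDIO_SERVICE);

//List every output device currently attached.
for (AudioDeviceInfo info : audioMgr.getDevices(AudioManager.GET_DEVICES_OUTPUTS)) {
  Log.d("AudioDevices", "type=" + info.getType() + " product=" + info.getProductName());
}

//Get notified about connections and disconnections.
audioMgr.registerAudioDeviceCallback(new AudioDeviceCallback() {
  @Override
  public void onAudioDevicesAdded(AudioDeviceInfo[] addedDevices) {
    Log.d("AudioDevices", addedDevices.length + " device(s) added");
  }

  @Override
  public void onAudioDevicesRemoved(AudioDeviceInfo[] removedDevices) {
    Log.d("AudioDevices", removedDevices.length + " device(s) removed");
  }
}, null);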

Changes in AudioManager

Some changes have been introduced in the AudioManager class, and they are as follows:

  • Using AudioManager to set the volume directly is not supported.
  • Using AudioManager to mute specific streams is not supported.
  • The AudioManager.setStreamSolo(int streamType, boolean state) method is deprecated. If you need exclusive audio playback, use AudioManager.requestAudioFocus(AudioManager.OnAudioFocusChangeListener l, int streamType, int durationHint).
  • The AudioManager.setStreamMute(int streamType, boolean state) method is deprecated. If you need to mute or unmute a stream, use AudioManager.adjustStreamVolume(int streamType, int direction, int flags) and pass one of the newly added constants as the direction (see the snippet after this list):
  • ADJUST_MUTE will mute the volume. Note that it has no effect if the stream is already muted.
  • ADJUST_UNMUTE will unmute the volume. Note that it has no effect if the stream is not muted.
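
A minimal sketch of muting and unmuting the music stream with these constants (the flags value of 0 is just an illustrative choice):

AudioManager am = (AudioManager) context.getSystemService(Context.AUDIO_SERVICE);

//Mute the music stream; no effect if it is already muted.
am.adjustStreamVolume(AudioManager.STREAM_MUSIC, AudioManager.ADJUST_MUTE, 0);

//Later, unmute it again; no effect if it is not muted.
am.adjustStreamVolume(AudioManager.STREAM_MUSIC, AudioManager.ADJUST_UNMUTE, 0);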

Video features

In Android Marshmallow, the video processing APIs have been upgraded with new capabilities. Some new methods and even a new class have been added just for developers.

android.media.MediaSync

The all-new MediaSync class has been designed to help us render audio and video streams in sync. You can also use it to play audio-only or video-only streams. It supports a dynamic playback rate, and buffers are fed in a non-blocking fashion with a callback return. For more information on proper usage, read:

https://developer.android.com/reference/android/media/MediaSync.html
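
As a rough, hedged sketch of the typical setup (the outputSurface, audioTrack, and playback speed here are assumptions, not a complete player):

//Wire a MediaSync to an output surface and an AudioTrack.
MediaSync sync = new MediaSync();
sync.setSurface(outputSurface); //the surface the video is rendered to
Surface inputSurface = sync.createInputSurface(); //feed this to the video codec
sync.setAudioTrack(audioTrack); //an AudioTrack built as shown earlier

sync.setCallback(new MediaSync.Callback() {
  @Override
  public void onAudioBufferConsumed(MediaSync s, ByteBuffer buffer, int bufferId) {
    //Return the buffer to the audio decoder and queue the next one.
  }
}, null);

//Non-blocking: decoded audio is queued with its presentation timestamp, e.g.
//sync.queueAudio(decodedBuffer, bufferId, presentationTimeUs);

//Start playback at normal speed; other rates give fast or slow motion.
sync.setPlaybackParams(new PlaybackParams().setSpeed(1.0f));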

MediaCodecInfo.CodecCapabilities.getMaxSupportedInstances

Now, we have the MediaCodecInfo.CodecCapabilities.getMaxSupportedInstances() helper method to get the maximum number of concurrent codec instances supported. However, we must treat this only as an upper bound; the actual number of concurrent instances can be lower, depending on the device and the amount of resources available at the time of usage.

Why do we need to know this?

Let's think of a case where we have a media-playing application and we want to add effects between the movies being played. We would need more than one video codec: two to decode the videos and one to encode the resulting video stream for display on screen. Checking with this API allows you to add features that rely on multiple codec instances, as sketched below.
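
A hedged sketch of querying the hint for an H.264 decoder (the video/avc MIME type is an illustrative choice):

//Find decoders for video/avc and read their concurrency hint.
MediaCodecList codecList = new MediaCodecList(MediaCodecList.REGULAR_CODECS);
for (MediaCodecInfo info : codecList.getCodecInfos()) {
  if (info.isEncoder()) {
    continue;
  }
  for (String type : info.getSupportedTypes()) {
    if (type.equalsIgnoreCase("video/avc")) {
      MediaCodecInfo.CodecCapabilities caps = info.getCapabilitiesForType(type);
      int maxInstances = caps.getMaxSupportedInstances(); //upper bound only
      Log.d("Codecs", info.getName() + " supports up to " + maxInstances + " instances");
    }
  }
}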

MediaPlayer.setPlaybackParams

We can now set the media playback rate for fast or slow-motion playback. This gives us the chance to create a fun video app where we slow down parts or speed them up, creating a new video while playing. Audio playback is synced accordingly, so you might hear a person talking slowly, or quickly for that matter.
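
A minimal sketch for half-speed playback (the 0.5f rate is an illustrative value, and mediaPlayer is assumed to be prepared already):

PlaybackParams params = mediaPlayer.getPlaybackParams();
params.setSpeed(0.5f); //0.5f = slow motion, 2.0f = fast motion
mediaPlayer.setPlaybackParams(params); //audio stays in sync for us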

Camera features

Android Lollipop introduced the new Camera2 API, and now, in Android Marshmallow, there are a few more updates to the camera, flashlight, and image-reprocessing features.

The flashlight API

Almost every device today has a camera, and almost every camera device has a flash unit. The setTorchMode() method has been added to control the flash torch mode.

The setTorchMode() method is used in the following manner:

CameraManager.setTorchMode(String cameraId, boolean enabled)

The cameraId element is the unique ID of the camera device whose flash unit you want to set the torch mode for. You can use getCameraIdList() to get the list of cameras and then getCameraCharacteristics(String cameraId) to check whether flash is supported on that camera. The setTorchMode() method allows you to turn the torch on or off without opening the camera device and without requesting the camera permission. The torch mode is switched off as soon as the camera device becomes unavailable, or when other camera resources that have the torch on become unavailable. Other apps can use the flash unit as well, so you need to check the mode when required or register a callback via the registerTorchCallback() method, as sketched below.
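
A hedged sketch that turns on the torch of the first camera reporting a flash unit and then listens for torch changes (error handling is trimmed):

CameraManager camMgr = (CameraManager) context.getSystemService(Context.CAMERA_SERVICE);
try {
  for (String id : camMgr.getCameraIdList()) {
    CameraCharacteristics chars = camMgr.getCameraCharacteristics(id);
    if (Boolean.TRUE.equals(chars.get(CameraCharacteristics.FLASH_INFO_AVAILABLE))) {
      camMgr.setTorchMode(id, true); //may fail if the camera is in use
      break;
    }
  }
} catch (CameraAccessException e) {
  Log.e("Torch", "Unable to toggle the torch", e);
}

//Stay informed when other apps change or take over the torch.
camMgr.registerTorchCallback(new CameraManager.TorchCallback() {
  @Override
  public void onTorchModeChanged(String cameraId, boolean enabled) {
    Log.d("Torch", cameraId + " torch enabled=" + enabled);
  }
}, null);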

Refer to the sample app, Torchi, to see the entire code at:

https://github.com/MaTriXy/Torchi

Note

Turning on the torch mode may fail if the camera or other camera resources are in use.

The reprocessing API

As mentioned earlier, the Camera2 API has been extended with support for YUV and private opaque format image reprocessing. Before using this API, we need to check whether these capabilities are available on the device. This is why we use the getCameraCharacteristics(String cameraId) method and check for the REPROCESS_MAX_CAPTURE_STALL key.
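
A hedged sketch of that check (the extra look at the advertised reprocessing capabilities is a precaution, not a requirement stated above; CameraAccessException handling is omitted):

CameraCharacteristics chars = camMgr.getCameraCharacteristics(cameraId);

//The key is only present when reprocessing is supported.
boolean hasStallKey = chars.get(CameraCharacteristics.REPROCESS_MAX_CAPTURE_STALL) != null;

//Optionally double-check the advertised capabilities as well.
boolean canReprocess = false;
for (int c : chars.get(CameraCharacteristics.REQUEST_AVAILABLE_CAPABILITIES)) {
  if (c == CameraMetadata.REQUEST_AVAILABLE_CAPABILITIES_YUV_REPROCESSING
      || c == CameraMetadata.REQUEST_AVAILABLE_CAPABILITIES_PRIVATE_REPROCESSING) {
    canReprocess = true;
  }
}
canReprocess = canReprocess && hasStallKey;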

android.media.ImageWriter

This is a new class that’s been added to Android 6.0.

It allows us to create an image and feed it into a surface, and from there back to the CameraDevice. Usually, ImageWriter is used along with ImageReader.

android.media.ImageReader

This is a new class that’s been added to Android 6.0.

It allows us direct access to the image data rendered in a surface. ImageReader, along with ImageWriter, allows our app to create an image feed from the camera to the surface and back to the camera for reprocessing.
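
A minimal, hedged sketch of wiring the pair for a reprocessing flow (the size, format, and the reprocessableSession variable are assumptions):

//The reader receives camera output; YUV_420_888 at 640x480 is illustrative.
ImageReader reader = ImageReader.newInstance(640, 480, ImageFormat.YUV_420_888, 2);

//The writer pushes images back into the reprocessable session's input surface.
final ImageWriter writer = ImageWriter.newInstance(reprocessableSession.getInputSurface(), 2);

reader.setOnImageAvailableListener(new ImageReader.OnImageAvailableListener() {
  @Override
  public void onImageAvailable(ImageReader r) {
    Image image = r.acquireNextImage();
    writer.queueInputImage(image); //hand the frame back for reprocessing
  }
}, null);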

Changes in the camera service

Android Marshmallow changes the first-come, first-served access model: the camera service now favors processes that are marked as high priority. This change means some extra logic-related work for us developers; we need to take into account situations where our app gets bumped up (higher priority) or bumped down (lower priority) due to a change in its state.

Let’s try and explain this in a few simple bullets:

  • When you want to access camera resources or open and configure a camera device, your access is verified according to the priority of your application process. An application process with foreground activities (visible to the user) is normally given a higher priority, which in turn gives it a better chance of getting the desired access when needed.
  • On the other side of the priority scale, you can find low-priority apps that can and will be tossed aside (their access revoked) when a high-priority application attempts to use the camera. For example, when using the Camera API, you will get the onError() call when evicted, and when using the Camera2 API, you will get the onDisconnected() call when evicted (see the sketch after this list).
  • Some devices out in the wild allow separate applications to open and use separate camera devices simultaneously. The camera service now detects performance issues caused by this kind of multi-process usage and disallows it; when the service detects such an issue, it will evict low-priority apps, even if only one app is using that camera device.
  • In a multi-user environment, when switching users, all active apps using the camera in the previous user's profile are evicted so that apps belonging to the current user get proper access. In other words, switching users always stops the previous profile's apps from using the camera.
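
For Camera2, a hedged sketch of reacting to eviction in the state callback might look like this:

CameraDevice.StateCallback stateCallback = new CameraDevice.StateCallback() {
  @Override
  public void onOpened(CameraDevice camera) {
    //Start the capture session as usual.
  }

  @Override
  public void onDisconnected(CameraDevice camera) {
    //A higher-priority app took the camera; release it and update the UI.
    camera.close();
  }

  @Override
  public void onError(CameraDevice camera, int error) {
    camera.close();
  }
};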

Deven Rathore

Deven is an entrepreneur and full-stack developer, constantly learning and experiencing new things. He currently runs CodeSource.io and Dunebook.com.
