Real-time Blend Demo for Windows Phone 8
Real-time Blend Demo is a Microsoft Developer example app demonstrating real-time use of the blend filter provided by the Lumia Imaging SDK. The selected texture is applied to a stream of images from the phone's camera, using the selected blend mode and blend global alpha. The user can explore the results with different textures, blend modes, and blend alpha values. The sample also demonstrates the use of a gradient input image (GradientImageSource), meaning that one of the available textures is generated in code instead of being a bitmap.
Version 1.2 adds blending of partial textures, with drag, pinch-zoom, and rotate gestures for precise positioning, sizing, and orientation.
Compatibility
- Compatible with Windows Phone 8.
- Tested on Nokia Lumia 520, Nokia Lumia 1020 and Nokia Lumia 1520.
- Developed with Visual Studio 2013 Express for Windows Phone 8.
- Compiling the project requires the Lumia Imaging SDK.
Design
The app has two pages: a main page with a full-screen viewfinder and the blend effect applied, and a texture selection page for changing the active blend texture. On the main viewfinder page, the camera preview image stream is blended with the selected texture using the selected blend mode and blend global alpha. You can change the blend mode by tapping the left and right indicators in the application button bar, adjust the blend effect level with the slider on the left, and change the active texture by tapping the change texture button in the application button bar.
Figure 1. Lens flare texture and hardlight blend mode with global alpha close to maximum
Figure 2. Texture selection page allows the user to select the texture to be used with the blend effect
There are two types of textures: fullscreen and partial. Selecting a partial texture enables texture positioning: the user can move, scale, and rotate the texture with drag and pinch gestures. Under the hood there is no difference; fullscreen and partial textures are treated the same. The differentiation is at the UI level: selecting a partial texture enables the gesture handling code. Textures are semitransparent bitmaps, with the exception of the red-green-blue gradient texture. It is technically not a bitmap but a Lumia Imaging SDK GradientImageSource, meaning that the gradient input image, the texture, is generated in code.
Architecture overview
The application structure consists of four key classes. MainPage displays the camera preview image feed using a VideoBrush and a MediaElement. The MediaElement gets the modified (blend effect applied) image stream from CameraStreamSource, which in turn gets the modified frames from Effects, which is connected to the PhotoCaptureDevice.
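The frame flow between these classes can be summarized as follows (a conceptual sketch based on the sections below, not code from the sample):

// Frame flow through the four key classes:
//
//   PhotoCaptureDevice           camera hardware
//        |
//   Effects                      applies the BlendEffect to each preview frame
//        |
//   CameraStreamSource           wraps the frames as a MediaStreamSource
//        |
//   MediaElement -> VideoBrush   MainPage renders the stream as a background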
Managing the camera preview image stream
The camera preview image stream is managed by the CameraStreamSource class. The GetSampleAsync(...) method uses Effects.GetNewFrameAndApplyEffect(...) to get the modified camera buffer, and reports it to the MediaElement with the protected MediaStreamSource.ReportGetSampleCompleted(...) method.
// ...
public class CameraStreamSource : MediaStreamSource
{
    private readonly Dictionary<MediaSampleAttributeKeys, string> _emptyAttributes =
        new Dictionary<MediaSampleAttributeKeys, string>();

    private MediaStreamDescription _videoStreamDescription = null;
    private MemoryStream _frameStream = null;
    private ICameraEffect _cameraEffect = null;
    private long _currentTime = 0;
    private int _frameStreamOffset = 0;
    private int _frameTime = 0;
    private int _frameCount = 0;
    private Size _frameSize = new Size(0, 0);
    private int _frameBufferSize = 0;
    private byte[] _frameBuffer = null;

    // ...

    protected override void OpenMediaAsync()
    {
        // Member variables are initialized here
        // ...
    }

    protected override void GetSampleAsync(MediaStreamType mediaStreamType)
    {
        var task = _cameraEffect.GetNewFrameAndApplyEffect(_frameBuffer.AsBuffer(), _frameSize);

        // When the asynchronous call completes, report the sample as completed
        task.ContinueWith((action) =>
        {
            _frameStream.Position = 0;
            _currentTime += _frameTime;
            _frameCount++;

            var sample = new MediaStreamSample(_videoStreamDescription, _frameStream, _frameStreamOffset,
                _frameBufferSize, _currentTime, _emptyAttributes);

            ReportGetSampleCompleted(sample);
        });
    }

    // ...
}
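The OpenMediaAsync() body is elided above. As a hedged sketch of what initializing a raw video MediaStreamSource typically looks like (the "RGBA" FourCC, zero duration, and non-seekable flags are assumptions in the spirit of the sample, not verbatim code from it):

protected override void OpenMediaAsync()
{
    // Describe the raw video stream that GetSampleAsync will feed (sketch)
    var streamAttributes = new Dictionary<MediaStreamAttributeKeys, string>();
    streamAttributes[MediaStreamAttributeKeys.VideoFourCC] = "RGBA";
    streamAttributes[MediaStreamAttributeKeys.Width] = ((int)_frameSize.Width).ToString();
    streamAttributes[MediaStreamAttributeKeys.Height] = ((int)_frameSize.Height).ToString();
    _videoStreamDescription = new MediaStreamDescription(MediaStreamType.Video, streamAttributes);

    // A live stream: no known duration and no seeking
    var sourceAttributes = new Dictionary<MediaSourceAttributesKeys, string>();
    sourceAttributes[MediaSourceAttributesKeys.Duration] = TimeSpan.Zero.Ticks.ToString();
    sourceAttributes[MediaSourceAttributesKeys.CanSeek] = false.ToString();

    ReportOpenMediaCompleted(sourceAttributes, new[] { _videoStreamDescription });
}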
Displaying the camera preview image stream
Displaying the modified camera preview image stream on the screen is handled by the MainPage class. The XAML declaration of the page contains a VideoBrush that paints the background of the LayoutRoot grid.
<Grid x:Name="LayoutRoot" Tap="LayoutRoot_Tap" ManipulationStarted="LayoutRoot_ManipulationStarted" ManipulationDelta="LayoutRoot_ManipulationDelta">
<Grid.Background>
<VideoBrush x:Name="BackgroundVideoBrush"/>
</Grid.Background>
...
</Grid>
In the C# code, an instance of CameraStreamSource is set as the source for MediaElement and then MediaElement is set as a source for VideoBrush:
// ...
public class MainPage : PhoneApplicationPage
{
    private MediaElement _mediaElement = null;
    private CameraStreamSource _cameraStreamSource = null;

    // ...

    private async void Initialize()
    {
        // Camera stream source is initialized here
        // ...

        _mediaElement = new MediaElement();
        _mediaElement.Stretch = Stretch.UniformToFill;
        _mediaElement.BufferingTime = new TimeSpan(0);
        _mediaElement.SetSource(_cameraStreamSource);

        BackgroundVideoBrush.SetSource(_mediaElement);

        // ...
    }

    // ...
}
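The elided camera stream source initialization boils down to opening the capture device and handing it to the effect pipeline. A minimal sketch, assuming an _effects member and a CameraStreamSource constructor taking the effect and frame size (the exact signatures in the sample may differ):

// Inside the async Initialize() method: open the back camera at VGA
// resolution (sketch; error handling omitted)
var captureResolution = new Windows.Foundation.Size(640, 480);
var device = await PhotoCaptureDevice.OpenAsync(CameraSensorLocation.Back, captureResolution);

// Wire the device into the effect pipeline and wrap it as a stream source
// (_effects and the constructor arguments are illustrative names)
_effects = new Effects() { CaptureDevice = device, GlobalAlpha = 0.5 };
_cameraStreamSource = new CameraStreamSource(_effects, captureResolution);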
Applying the blend effect using a bitmap or an in-code generated gradient texture
The blend effect is applied by the GetNewFrameAndApplyEffect(...) method in the Effects class. Notice that if no valid texture image URI has been set with the SetTexture(...) method, the Initialize(...) method automatically generates the red-green-blue texture in code, using the Lumia Imaging SDK's GradientImageSource.
public class Effects
{
    private PhotoCaptureDevice _photoCaptureDevice = null;
    private CameraPreviewImageSource _cameraPreviewImageSource = null;
    private BlendEffect _blendEffect = null;
    private Uri _blendImageUri = null;
    private IImageProvider _blendImageProvider = null;
    private Semaphore _semaphore = new Semaphore(1, 1);

    // ...

    public double GlobalAlpha { get; set; }

    public PhotoCaptureDevice CaptureDevice
    {
        set
        {
            if (_photoCaptureDevice != value)
            {
                while (!_semaphore.WaitOne(100));

                _photoCaptureDevice = value;

                Initialize();

                _semaphore.Release();
            }
        }
    }

    // ...

    public async Task GetNewFrameAndApplyEffect(IBuffer frameBuffer, Size frameSize)
    {
        if (_semaphore.WaitOne(500))
        {
            var scanlineByteSize = (uint)frameSize.Width * 4; // 4 bytes per pixel in Bgra8888 mode
            var bitmap = new Bitmap(frameSize, ColorMode.Bgra8888, scanlineByteSize, frameBuffer);

            if (_blendEffect != null)
            {
                _blendEffect.GlobalAlpha = GlobalAlpha;

                var renderer = new BitmapRenderer(_blendEffect, bitmap);
                await renderer.RenderAsync();
            }
            else
            {
                var renderer = new BitmapRenderer(_cameraPreviewImageSource, bitmap);
                await renderer.RenderAsync();
            }

            // ...

            _semaphore.Release();
        }
    }

    public void SetTexture(Uri textureUri)
    {
        if (_semaphore.WaitOne(500))
        {
            Uninitialize();

            _blendImageUri = textureUri;

            Initialize();

            _semaphore.Release();
        }
    }

    private void Initialize()
    {
        _cameraPreviewImageSource = new CameraPreviewImageSource(_photoCaptureDevice);

        if (_blendImageUri != null)
        {
            // Using the texture set with the SetTexture method
            _blendImageProvider = new StreamImageSource(System.Windows.Application.GetResourceStream(_blendImageUri).Stream);
        }
        else
        {
            // No texture set with the SetTexture method, fall back to an in-code generated
            // red-green-blue gradient texture
            var colorStops = new GradientStop[]
            {
                new GradientStop() { Color = Color.FromArgb(0xFF, 0xFF, 0x00, 0x00), Offset = 0.0 }, // Red
                new GradientStop() { Color = Color.FromArgb(0xFF, 0x00, 0xFF, 0x00), Offset = 0.7 }, // Green
                new GradientStop() { Color = Color.FromArgb(0xFF, 0x00, 0x00, 0xFF), Offset = 1.0 }  // Blue
            };

            var gradient = new RadialGradient(new Point(0, 0), new EllipseRadius(1, 0), colorStops);
            var size = new Size(640, 480);
            _blendImageProvider = new GradientImageSource(size, gradient);
        }

        switch (_effectIndex)
        {
            case 0:
                {
                    EffectName = "1/16 - None";
                }
                break;

            case 1:
                {
                    EffectName = String.Format(nameFormat, _effectIndex + 1, AppResources.Filter_Blend_Normal);
                    _blendEffect = new BlendEffect(_cameraPreviewImageSource, _blendImageProvider, BlendFunction.Normal, GlobalAlpha);
                }
                break;

            // ...
        }

        if (_blendEffect != null)
        {
            _blendEffect.TargetArea = _targetArea;
            _blendEffect.TargetAreaRotation = _targetAreaRotation;
        }
    }

    // ...
}
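As a usage sketch (the texture path is illustrative, not an asset name from the sample), switching textures from the texture selection page reduces to calls like:

// Blend a bundled semitransparent texture over the viewfinder...
_effects.SetTexture(new Uri("/Assets/Textures/LensFlare.png", UriKind.Relative));

// ...or clear the URI to fall back to the generated red-green-blue gradient
_effects.SetTexture(null);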
Positioning the blend effect
By default, the BlendEffect foreground image is scaled to fill the background image. The position and size of the foreground image can be defined by using the TargetArea property of BlendEffect. TargetArea specifies a rectangular area where the foreground image gets drawn. TargetArea coordinates are represented in unit coordinate space relative to the background image where (0, 0) is the top left corner and (1, 1) is the bottom right corner. The foreground image is scaled and stretched to fit TargetArea according to the TargetOutputOption property. TargetArea can also be rotated by setting the TargetAreaRotation property.
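For example, a sketch like the following (values chosen purely for illustration) draws the foreground texture at half size in the middle of the background, rotated 45 degrees:

// Unit coordinate space: (0, 0) is the top left corner, (1, 1) the bottom right
_blendEffect.TargetArea = new Rect(0.25, 0.25, 0.5, 0.5); // centered, half width and height
_blendEffect.TargetAreaRotation = 45.0;                   // rotation in degrees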
Figure 3. Positioned, rotated and scaled blend effect applied to the camera viewfinder image
Figure 4. Partial filter texture selection view
In the Real-time Blend Demo, the textures selected from the "Partial" category can be moved, scaled, and rotated using drag and pinch gestures. The gesture event handling is implemented in the MainPage class by handling the manipulation events of the LayoutRoot element (see the LayoutRoot XAML declaration shown earlier). The TargetArea and TargetAreaRotation of the BlendEffect are set in the SetTargetArea method of the Effects class; a sketch of a matching manipulation handler follows the listing below. Note that a semaphore is used to prevent modification of TargetArea while the effect is being rendered.
public void SetTargetArea(Rect targetArea, double targetAreaRotation)
{
    if (_semaphore.WaitOne(500))
    {
        _targetArea = targetArea;
        _targetAreaRotation = targetAreaRotation;

        if (_blendEffect != null)
        {
            _blendEffect.TargetArea = targetArea;
            _blendEffect.TargetAreaRotation = targetAreaRotation;
        }

        _semaphore.Release();
    }
}
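On the MainPage side, a manipulation event handler along the following lines can feed SetTargetArea; the members _effects, _textureCenter, _textureSize, and _textureRotation are illustrative names, not code lifted from the sample:

private void LayoutRoot_ManipulationDelta(object sender, ManipulationDeltaEventArgs e)
{
    // Drag: convert the pixel translation into the unit coordinate space
    _textureCenter.X += e.DeltaManipulation.Translation.X / LayoutRoot.ActualWidth;
    _textureCenter.Y += e.DeltaManipulation.Translation.Y / LayoutRoot.ActualHeight;

    // Pinch: average the X/Y scale factors reported by Silverlight
    if (e.DeltaManipulation.Scale.X > 0 && e.DeltaManipulation.Scale.Y > 0)
    {
        _textureSize *= (e.DeltaManipulation.Scale.X + e.DeltaManipulation.Scale.Y) / 2;
    }

    // Rebuild the unit-space rectangle around the new center and push it to Effects
    var targetArea = new Rect(_textureCenter.X - _textureSize / 2,
                              _textureCenter.Y - _textureSize / 2,
                              _textureSize, _textureSize);

    _effects.SetTargetArea(targetArea, _textureRotation);
}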
Downloads
Real-time Blend Demo source code: real-time-blend-demo-master.zip
This example application is hosted on GitHub, where you can check the latest activities, report issues, browse the source, ask questions, or even contribute to the project yourself.