Lens Blur for Windows Phone 8
Lens Blur is an example app that uses the Lumia Imaging SDK Interactive Foreground Segmenter to first create a foreground/background mask, and then applies a Lens Blur Effect (also known as "bokeh") to the areas marked as background, using a user-selectable blur kernel shape and size. Foreground segments are not blurred.
Compatibility
- Compatible with Windows Phone 8.
- Tested with Nokia Lumia 520, Nokia Lumia 1020 and Nokia Lumia 1520.
- Developed with Visual Studio 2013 Express for Windows Phone.
- Compiling the project requires the Lumia Imaging SDK.
Design
The app launches into a built-in gallery that provides the user with a couple of preselected photos. After the user selects a photo, the app navigates to a segmentation view, in which the user draws with foreground and background colors to indicate which areas of the image should be treated as foreground and which as background. It is not necessary to trace exact borders; a rough indication is enough. Based on these annotations, the app automatically renders a preview of the detected segments, and the user can add further annotations if the correct segments were not found.
When the segments look correct, the user taps Accept to navigate to the lens blur view, in which the areas marked as background are blurred using a user-selectable blur kernel shape and size. Foreground segments are not blurred, which produces a "bokeh"-style effect in the result image. Tapping the preview opens the photo in a pinch-zoomable fullscreen mode. Finally, the result image can be saved to the Photos library.
Architecture overview
The application architecture is quite simple; only a rough overview is given here.
- GalleryPage provides a set of images bundled in the application package, and also has access to the MediaLibrary for the user's existing photos.
- SegmenterPage contains a drawing area in which the user can draw annotations. It also runs the InteractiveForegroundSegmenter effect after each annotation change and shows the resulting foreground/background segmentation on the screen.
- EffectPage contains a row of buttons and a slider for selecting the desired blur kernel shape and size, and renders a preview of the blurred image after each parameter change. It also saves a high-resolution version of the result image to the Photos library.
- ZoomPage is accessible from the EffectPage and provides a pinch-zoomable fullscreen image view.
- HelpPage and AboutPage contain instructions on how to use the application, along with some additional information.
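Navigation between the pages follows the standard Windows Phone page model. As a minimal sketch of how a selected photo could be handed from the gallery to the segmentation view (the handler name and the page URI below are assumptions for illustration, not taken from the example source):
namespace LensBlurApp.Pages
{
    public partial class GalleryPage : PhoneApplicationPage
    {
        // Hypothetical handler: store the chosen photo in the model and
        // navigate to the segmentation view.
        private void Photo_Selected(Stream photoStream)
        {
            Model.OriginalImage = photoStream;

            NavigationService.Navigate(
                new Uri("/Pages/SegmenterPage.xaml", UriKind.Relative));
        }
    }
}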
Drawing annotations
The application model declares the foreground and background annotation colors, and holds both the original photo opened from the Photos library and the generated annotations bitmap, which is created on the SegmenterPage but consumed on the EffectPage.
namespace LensBlurApp.Models
{
    public class Model
    {
        private static Stream _originalImageStream;
        private static Bitmap _annotationsBitmap;

        public static readonly SolidColorBrush ForegroundBrush = new SolidColorBrush(Colors.Red);
        public static readonly SolidColorBrush BackgroundBrush = new SolidColorBrush(Colors.Blue);

        // ...
    }
}
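The pages access the photo and the annotations through static Model.OriginalImage and Model.AnnotationsBitmap properties, which are used in the code later in this article. Their implementation is not shown above; a minimal sketch, assuming they simply wrap the private fields, could look like this:
// Assumed accessors for the fields above (the real implementation may do more,
// for example invalidate cached state when the photo changes).
public static Stream OriginalImage
{
    get { return _originalImageStream; }
    set { _originalImageStream = value; }
}

public static Bitmap AnnotationsBitmap
{
    get { return _annotationsBitmap; }
    set { _annotationsBitmap = value; }
}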
Annotation drawing is implemented by stacking the original photo, the generated segment mask, a persistent annotations canvas, a temporary (in-progress) annotation canvas, and a touch-sensitive manipulation area on top of each other, and by drawing Polyline elements on the canvases.
<phone:PhoneApplicationPage ...>
    ...
    <Grid HorizontalAlignment="Center" VerticalAlignment="Center" Margin="0,0,0,50">
        ...
        <Image x:Name="OriginalImage" Stretch="Uniform" MaxWidth="456" MaxHeight="500"/>
        <Image x:Name="MaskImage" Opacity="0.625" Stretch="Uniform"/>
        <Canvas x:Name="AnnotationsCanvas"/>
        <Canvas x:Name="CurrentAnnotationCanvas"/>
        <Grid x:Name="ManipulationArea" Background="Transparent" Margin="-75"/>
    </Grid>
    ...
</phone:PhoneApplicationPage>
Because the segmenter requires both foreground and background annotations before it can be invoked successfully, the application verifies this by checking whether something has been drawn with both the foreground and the background brush.
namespace LensBlurApp.Pages
{
    public partial class SegmenterPage : PhoneApplicationPage
    {
        // ...

        private bool ForegroundAnnotationsDrawn
        {
            get
            {
                return AnnotationsCanvas.Children.Cast<Polyline>().Any(p => p.Stroke == Model.ForegroundBrush);
            }
        }

        private bool BackgroundAnnotationsDrawn
        {
            get
            {
                return AnnotationsCanvas.Children.Cast<Polyline>().Any(p => p.Stroke == Model.BackgroundBrush);
            }
        }

        private Point NearestPointInElement(double x, double y, FrameworkElement element)
        {
            var clampedX = Math.Min(Math.Max(0, x), element.ActualWidth);
            var clampedY = Math.Min(Math.Max(0, y), element.ActualHeight);

            return new Point(clampedX, clampedY);
        }

        // ...
    }
}
During touch manipulation, the application first draws the polyline on the temporary annotation canvas, so that it does not interfere with a possibly ongoing segment preview rendering.
namespace LensBlurApp.Pages
{
    public partial class SegmenterPage : PhoneApplicationPage
    {
        private Polyline _polyline;
        private SolidColorBrush _brush;

        // ...

        private void AnnotationsCanvas_ManipulationStarted(object sender,
            System.Windows.Input.ManipulationStartedEventArgs e)
        {
            _manipulating = true;

            _polyline = new Polyline
            {
                Stroke = _brush,
                StrokeThickness = 6
            };

            var manipulationAreaDeltaX = ManipulationArea.Margin.Left;
            var manipulationAreaDeltaY = ManipulationArea.Margin.Top;

            var point = NearestPointInElement(
                e.ManipulationOrigin.X + manipulationAreaDeltaX,
                e.ManipulationOrigin.Y + manipulationAreaDeltaY,
                AnnotationsCanvas);

            _polyline.Points.Add(point);

            CurrentAnnotationCanvas.Children.Add(_polyline);
        }

        private void AnnotationsCanvas_ManipulationDelta(object sender,
            System.Windows.Input.ManipulationDeltaEventArgs e)
        {
            var manipulationAreaDeltaX = ManipulationArea.Margin.Left;
            var manipulationAreaDeltaY = ManipulationArea.Margin.Top;

            var x = e.ManipulationOrigin.X - e.DeltaManipulation.Translation.X + manipulationAreaDeltaX;
            var y = e.ManipulationOrigin.Y - e.DeltaManipulation.Translation.Y + manipulationAreaDeltaY;

            var point = NearestPointInElement(x, y, AnnotationsCanvas);

            _polyline.Points.Add(point);
        }

        // ...
    }
}
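The _brush field above determines which annotation color is drawn. How it is set is not shown in this article; a minimal sketch, assuming two application bar buttons whose handler names are hypothetical, could look like this:
private void ForegroundButton_Click(object sender, EventArgs e)
{
    // Draw subsequent annotations with the foreground color.
    _brush = Model.ForegroundBrush;
}

private void BackgroundButton_Click(object sender, EventArgs e)
{
    // Draw subsequent annotations with the background color.
    _brush = Model.BackgroundBrush;
}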
When the annotation is complete, it is verified (it needs to contain at least two points) and then moved to the persistent annotations canvas.
namespace LensBlurApp.Pages
{
    public partial class SegmenterPage : PhoneApplicationPage
    {
        // ...

        private void AnnotationsCanvas_ManipulationCompleted(object sender,
            System.Windows.Input.ManipulationCompletedEventArgs e)
        {
            if (_polyline.Points.Count < 2)
            {
                CurrentAnnotationCanvas.Children.Clear();

                _manipulating = false;
            }
            else
            {
                CurrentAnnotationCanvas.Children.RemoveAt(CurrentAnnotationCanvas.Children.Count - 1);
                AnnotationsCanvas.Children.Add(_polyline);

                // ...

                AttemptUpdatePreviewAsync();
            }

            _polyline = null;
        }

        // ...
    }
}
Creating the segmentation bitmap based on annotations
On the SegmenterPage, the asynchronous AttemptUpdatePreviewAsync method creates the segmentation bitmap when requested. If the annotations change while the previous ones are still being processed, a new rendering is started as soon as the current one completes.
namespace LensBlurApp.Pages
{
    public partial class SegmenterPage : PhoneApplicationPage
    {
        private bool _processingPending;
        private bool _processing;

        // ...

        private async void AttemptUpdatePreviewAsync()
        {
            if (!Processing)
            {
                Processing = true;

                do
                {
                    _processingPending = false;

                    if (Model.OriginalImage != null && ForegroundAnnotationsDrawn && BackgroundAnnotationsDrawn)
                    {
                        Model.OriginalImage.Position = 0;

                        var maskBitmap = new WriteableBitmap(
                            (int)AnnotationsCanvas.ActualWidth,
                            (int)AnnotationsCanvas.ActualHeight);

                        var annotationsBitmap = new WriteableBitmap(
                            (int)AnnotationsCanvas.ActualWidth,
                            (int)AnnotationsCanvas.ActualHeight);

                        annotationsBitmap.Render(AnnotationsCanvas, new ScaleTransform
                        {
                            ScaleX = 1,
                            ScaleY = 1
                        });

                        annotationsBitmap.Invalidate();

                        Model.OriginalImage.Position = 0;

                        using (var source = new StreamImageSource(Model.OriginalImage))
                        using (var segmenter = new InteractiveForegroundSegmenter(source))
                        using (var renderer = new WriteableBitmapRenderer(segmenter, maskBitmap))
                        using (var annotationsSource = new BitmapImageSource(annotationsBitmap.AsBitmap()))
                        {
                            var foregroundColor = Model.ForegroundBrush.Color;
                            var backgroundColor = Model.BackgroundBrush.Color;

                            segmenter.ForegroundColor = Windows.UI.Color.FromArgb(
                                foregroundColor.A, foregroundColor.R,
                                foregroundColor.G, foregroundColor.B);
                            segmenter.BackgroundColor = Windows.UI.Color.FromArgb(
                                backgroundColor.A, backgroundColor.R,
                                backgroundColor.G, backgroundColor.B);
                            segmenter.Quality = 0.5;
                            segmenter.AnnotationsSource = annotationsSource;

                            await renderer.RenderAsync();

                            MaskImage.Source = maskBitmap;
                            maskBitmap.Invalidate();

                            Model.AnnotationsBitmap = (Bitmap)annotationsBitmap.AsBitmap();
                        }
                    }
                    else
                    {
                        MaskImage.Source = null;
                    }
                }
                while (_processingPending && !_manipulating);

                Processing = false;
            }
            else
            {
                _processingPending = true;
            }
        }
    }
}
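Note that the code above sets a Processing property rather than the _processing field directly. Its implementation is not shown in this article; a minimal sketch, assuming the property simply wraps the field and toggles a progress indicator (the ProcessingProgressBar element name is hypothetical), could look like this:
private bool Processing
{
    get
    {
        return _processing;
    }

    set
    {
        _processing = value;

        // Hypothetical: show a progress indicator while a render is in flight.
        ProcessingProgressBar.Visibility = _processing ? Visibility.Visible : Visibility.Collapsed;
    }
}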
Applying lens blur to the image background segments
After the segmentation is complete and the annotations bitmap is available, a blur effect can be applied to, for example, only the background areas of the image, creating a camera-like "bokeh" effect. In the Lens Blur app, this is done in the asynchronous AttemptUpdatePreviewAsync method on the EffectPage.
namespace LensBlurApp.Pages
{
    public partial class EffectPage : PhoneApplicationPage
    {
        private bool _processingPending;
        private LensBlurPredefinedKernelShape _shape = LensBlurPredefinedKernelShape.Circle;
        private bool _processing;

        // ...

        private void HeartButton_Click(object sender, RoutedEventArgs e)
        {
            if (_shape != LensBlurPredefinedKernelShape.Heart)
            {
                _shape = LensBlurPredefinedKernelShape.Heart;

                AttemptUpdatePreviewAsync();

                // ...
            }
        }

        private async void AttemptUpdatePreviewAsync()
        {
            if (!Processing)
            {
                Processing = true;

                Model.OriginalImage.Position = 0;

                using (var source = new StreamImageSource(Model.OriginalImage))
                using (var segmenter = new InteractiveForegroundSegmenter(source))
                using (var annotationsSource = new BitmapImageSource(Model.AnnotationsBitmap))
                {
                    segmenter.Quality = 0.5;
                    segmenter.AnnotationsSource = annotationsSource;

                    var foregroundColor = Model.ForegroundBrush.Color;
                    var backgroundColor = Model.BackgroundBrush.Color;

                    segmenter.ForegroundColor = Windows.UI.Color.FromArgb(
                        foregroundColor.A, foregroundColor.R,
                        foregroundColor.G, foregroundColor.B);
                    segmenter.BackgroundColor = Windows.UI.Color.FromArgb(
                        backgroundColor.A, backgroundColor.R,
                        backgroundColor.G, backgroundColor.B);

                    do
                    {
                        _processingPending = false;

                        var previewBitmap = new WriteableBitmap(
                            (int)Model.AnnotationsBitmap.Dimensions.Width,
                            (int)Model.AnnotationsBitmap.Dimensions.Height);

                        using (var effect = new LensBlurEffect(source, new LensBlurPredefinedKernel(_shape, (uint)SizeSlider.Value)))
                        using (var renderer = new WriteableBitmapRenderer(effect, previewBitmap))
                        {
                            effect.KernelMap = segmenter;

                            await renderer.RenderAsync();

                            PreviewImage.Source = previewBitmap;
                            previewBitmap.Invalidate();
                        }
                    }
                    while (_processingPending);
                }

                Processing = false;
            }
            else
            {
                _processingPending = true;
            }
        }
    }
}
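The EffectPage also saves a high-resolution version of the result image to the Photos library. The saving code is not shown above; a minimal sketch, assuming the result is rendered with a JpegRenderer and stored with the XNA MediaLibrary (the method name, the picture name, and the exact pipeline setup are assumptions, not taken from the example source), could look like this:
private async Task SaveResultAsync()
{
    Model.OriginalImage.Position = 0;

    using (var source = new StreamImageSource(Model.OriginalImage))
    using (var segmenter = new InteractiveForegroundSegmenter(source))
    using (var annotationsSource = new BitmapImageSource(Model.AnnotationsBitmap))
    using (var effect = new LensBlurEffect(source, new LensBlurPredefinedKernel(_shape, (uint)SizeSlider.Value)))
    using (var renderer = new JpegRenderer(effect))
    {
        // Same segmenter setup as in the preview code.
        var foregroundColor = Model.ForegroundBrush.Color;
        var backgroundColor = Model.BackgroundBrush.Color;

        segmenter.ForegroundColor = Windows.UI.Color.FromArgb(
            foregroundColor.A, foregroundColor.R, foregroundColor.G, foregroundColor.B);
        segmenter.BackgroundColor = Windows.UI.Color.FromArgb(
            backgroundColor.A, backgroundColor.R, backgroundColor.G, backgroundColor.B);
        segmenter.Quality = 0.5;
        segmenter.AnnotationsSource = annotationsSource;

        effect.KernelMap = segmenter;

        // Render the full-resolution result to a JPEG buffer.
        var buffer = await renderer.RenderAsync();

        // Save to the Photos library. AsStream() requires
        // using System.Runtime.InteropServices.WindowsRuntime;
        // and saving requires the ID_CAP_MEDIALIB_PHOTO capability.
        using (var library = new Microsoft.Xna.Framework.Media.MediaLibrary())
        using (var stream = buffer.AsStream())
        {
            library.SavePicture("LensBlurResult", stream);
        }
    }
}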
See also
- Interactive Foreground Segmenter overview
- Lens Blur Effect overview
- LensBlurEffect class
- InteractiveForegroundSegmenter class
Downloads
- Lens Blur source code: lens-blur-master.zip
This example application is hosted in GitHub, where you can check the latest activities, report issues, browse source, ask questions, or even contribute to the project yourself.