Xamarin: Introducing The Xamarin Show on Channel 9

Microsoft’s Channel 9 is host to a plethora of amazing videos and shows for developers, delivered directly from the people who are working hard behind the scenes each day at Microsoft. Today, I’m excited to announce a brand new weekly development show on Channel 9, aptly named The Xamarin Show—a dedicated weekly show hosted […]

The post Introducing The Xamarin Show on Channel 9 appeared first on Xamarin Blog.

Details

Gerald Versluis: Be ahead! Test your apps with the latest iOS (beta) version

After installing the latest Xamarin.iOS beta build, I got an error message while building.

Xamarin.iOS 10.0 SDK error message

‘This version of Xamarin.iOS requires the iOS 10.0 SDK (shipped with Xcode 8.0) when the managed linker is disabled. Either upgrade Xcode, or enable the managed linker.’

That’s pretty self-explanatory, right? So if you want to go quick and dirty, just go into the project properties and enable the managed linker. But I thought to myself: ‘why not take this opportunity to test my app with the new iOS 10 as well and make sure it’s still working OK?’ So that’s what I did. And it wasn’t even that hard!

The big objection I had to this was that I did not want my production development environment to stop working. But as it turned out, you can leave it intact! Yay!

The first thing you need to do is download the new Xcode version. You can do that from the Apple Developer portal: just log in and go to Downloads in the lower left-hand corner.

Click the nice blue Download button next to Xcode 8 beta x (6 in my case) and wait for the approximately 4 GB download to come in.

Download Xcode 8 beta version

After it is downloaded, unzip it. You’ll see that the app is called ‘Xcode-beta’, so the default Xcode won’t be overwritten. Nice!

Just place Xcode-beta in your Applications folder and start it. You’ll have to agree to some updated EULAs, and some components need to be verified. While that is going on, start Xamarin Studio on your Mac and go into the Preferences.

Xamarin Studio iOS SDK configuration

Find the SDK Locations node and click Apple. You’ll see the current location is the default one, which points to the stable Xcode.app (/Applications/Xcode.app).

Replace this with the freshly installed Xcode-beta.app by simply adding ‘-beta’ to the path, and that’s it! Don’t forget to save the new preferences, wait for the Xcode beta to be up and running, restart Visual Studio if you’re working with that, and try to build again. You’ll see it now works!

Also, in the devices list you will now find the iOS 10 simulator images, so you can start testing and developing for that.
If you have some work to do on the stable Xcode and iOS SDKs, just go back into Xamarin Studio, reset the Apple SDK location to ‘Xcode.app’, restart Visual Studio if you use it, and you can work with that again!

Pretty easy right?!

Please note that you cannot submit builds of your iOS app that use the beta iOS SDK to the App Store! You can only do that after iOS 10 has been released officially and you have rebuilt your app with the stable SDK.

Details

Xamarin: Xamarin.Android 7.0 Now With More Nougat

Android 7.0 Nougat brings several exciting features to the Android platform, including multi-window support, notification enhancements, data saver, and many new APIs, such as quick settings. We’re excited to announce that we have published Xamarin.Android support for Android 7.0 for both Xamarin Studio and Visual Studio. This release of Xamarin.Android is currently available in our […]

The post Xamarin.Android 7.0 Now With More Nougat appeared first on Xamarin Blog.

Details

Greg Shackles: Getting Started with Azure Functions and F#

While it’s been possible to use F# in Azure Functions for some time now, it wasn’t until this week that it really became a first-class citizen. Previously it would execute your F# scripts by calling out to fsi (F# Interactive), but now the runtime is fully available, including input and output bindings, making it a far more compelling option.

I recently built a somewhat complex “serverless” application using AWS Lambda and JavaScript, thinking to myself the entire time that I wished I could have been writing it in F#. In this world of event-driven functions a language like F# really shines, so I’m excited to see Microsoft embrace supporting it in Azure Functions. In this post I’ll walk through creating a simple Azure Functions application in F# that takes in a URL for an image, runs it through Microsoft’s Cognitive Services Emotion API, and overlays each face with an emoji that matches the detected emotion. This started out as an attempt to replicate Scott Hanselman’s demo in F#, but then I figured I may as well take it a step further while I was in there.

Initial Setup

While you can do a lot through the editor inside the Azure portal, for this demo I’m going to walk through creating an application that uses source control to handle deployments, since this is closer to what you’d be doing for any real application.

If you haven’t installed it already, you’ll want to install the azurefunctions npm package:

npm i -g azurefunctions  

This is a nice CLI tool the Azure Functions team maintains to help build and manage functions. I will also note that these things are all in a preview state and a bit of a moving target right now, so the experience isn’t without a few rough edges. I have no doubts these will be smoothed out over time.

With that installed, run func init to create a new Git repository with some initial files:

C:\code\github\gshackles\facemoji> func init
Writing .gitignore  
Writing host.json  
Writing .secrets  
Initialized empty Git repository in C:/code/github/gshackles/facemoji/.git/


Tip: run func new to create your first function.

Next, commit that to your repository and push that out somewhere. In my case, I’m using GitHub.

In the Azure portal, go ahead and create a new Function App, and then under its settings choose to configure continuous integration. Connect the app to the Git repository you just created, which will allow Azure to automatically deploy your functions anytime you push.

Create The Function

Now we can actually start creating our function! From the command line, run func new:

C:\code\github\gshackles\facemoji [master +3 ~0 -0 !]> func new

     _-----_
    |       |    ╭──────────────────────────╮
    |--(o)--|    │   Welcome to the Azure   │
   `---------´   │   Functions generator!   │
    ( _´U`_ )    ╰──────────────────────────╯
    /___A___\   /
     |  ~  |
   __'.___.'__
 ´   `  |° ´ Y `

? Select an option... List all templates
There are 50 templates available  
? Select from one of the available templates... QueueTrigger-FSharp
? Enter a name for your function... facemoji
Creating your function facemoji...  
Location for your function...  
C:\code\github\gshackles\facemoji\facemoji


Tip: run `func run <functionName>` to run the function.  

This is one of those rough edges I mentioned – as of right now the only F# template in this tool is QueueTrigger-FSharp, so we’ll choose that even though it doesn’t match what we’re actually going to do. I’m sure this will be updated very soon with more up-to-date options.

In our case we’re going to use HTTP input and output instead of being driven by a queue, so update the contents of function.json to:

{
  "bindings": [
    {
      "type": "httpTrigger",
      "name": "req",
      "authLevel": "anonymous",
      "direction": "in"
    },
    {
      "type": "http",
      "name": "res",
      "direction": "out"
    }
  ],
  "disabled": false
}

We can also go ahead and add a project.json file to declare some NuGet dependencies:

{
    "frameworks": {
        "net46": {
            "dependencies": {
                "FSharp.Data": "2.3.2",
                "Newtonsoft.Json": "9.0.1"
            }
        }
    }
}

You’ll also want to copy in the emoji PNG files found in my GitHub repository. Finally, go into your app settings and add a setting named EmotionApiKey with the key you get from Cognitive Services as its value.

Implement the Function

Okay, with all that out of the way, let’s actually implement this thing! The implementation of the function will go in run.fsx. Since this is F# we will build things out from top to bottom as small functions we can compose together. First we can pull in some references we’ll need:

#r "System.Drawing"

open System  
open System.IO  
open System.Net  
open System.Net.Http.Headers  
open System.Drawing  
open System.Drawing.Imaging  
open FSharp.Data  
open Newtonsoft.Json  

Next, create a few types to match the Cognitive Services API models and pull in some environment variables:

type FaceRectangle = { Height: int; Width: int; Top: int; Left: int; }  
type Scores = { Anger: float; Contempt: float; Disgust: float; Fear: float;  
                Happiness: float; Neutral: float; Sadness: float; Surprise: float; }
type Face = { FaceRectangle: FaceRectangle; Scores: Scores }

let apiKey = Environment.GetEnvironmentVariable("EmotionApiKey")  
let appPath = Path.Combine(Environment.GetEnvironmentVariable("HOME"), "site", "wwwroot", "facemoji")  

Originally I had wanted to use the JSON type provider to avoid needing Json.NET and these hand-written models, but I ran into some issues there – another rough edge I suspect will be ironed out.
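For the curious, here’s a minimal sketch of what that type provider approach could look like. The inline sample JSON is hypothetical and simply mirrors the Face shape defined above; again, this is not what the final run.fsx uses:

// A hypothetical sketch of the JSON type provider approach (not what the
// final run.fsx uses); the inline sample mirrors the Face records above.
type FacesProvider = JsonProvider<"""[{"faceRectangle":{"height":1,"width":1,"top":1,"left":1},"scores":{"anger":0.0,"contempt":0.0,"disgust":0.0,"fear":0.0,"happiness":1.0,"neutral":0.0,"sadness":0.0,"surprise":0.0}}]""">

// Parse gives back strongly typed faces without any hand-written models:
let parseFaces (json: string) = FacesProvider.Parse(json)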

Next, we’ll need to parse the query string of the request sent to us, grab the image URL from it, and download the image into a byte array:

let getImageUrl (req: HttpRequestMessage) =  
    req.GetQueryNameValuePairs()
    |> Seq.find(fun pair -> pair.Key.ToLowerInvariant() = "url")
    |> fun pair -> pair.Value

let getImage url =  
    Http.Request(url, httpMethod = "GET")
    |> fun (imageResponse) -> 
        match imageResponse.Body with
        | Binary bytes -> bytes
        | _ -> failwith "expected binary response but received text"

With the image downloaded, we can send it to Cognitive Services to have it analyzed:

let getFaces bytes =  
    Http.RequestString("https://api.projectoxford.ai/emotion/v1.0/recognize",
        httpMethod = "POST",
        headers = [ "Ocp-Apim-Subscription-Key", apiKey ],
        body = BinaryUpload bytes)
    |> fun (json) -> JsonConvert.DeserializeObject<Face[]>(json)

Now that we have a list of faces in the image, we need to determine which emoji to show for each one:

let getEmoji face =  
    match face.Scores with
        | scores when scores.Anger > 0.1 -> "angry.png"
        | scores when scores.Fear > 0.1 -> "afraid.png"
        | scores when scores.Sadness > 0.1 -> "sad.png"
        | scores when scores.Happiness > 0.5 -> "happy.png"
        | _ -> "neutral.png"
    |> fun filename -> Path.Combine(appPath, filename)
    |> Image.FromFile

So now we have an image, a list of faces, and a fitting emoji for each. Let’s tie those together and draw the emoji on the image, returning a new image byte array:

let drawImage (bytes: byte[]) faces =  
    use inputStream = new MemoryStream(bytes)
    use image = Image.FromStream(inputStream)
    use graphics = Graphics.FromImage(image)

    faces |> Array.iter(fun face ->
        let rect = face.FaceRectangle
        let emoji = getEmoji face
        graphics.DrawImage(emoji, rect.Left, rect.Top, rect.Width, rect.Height)
    )

    use outputStream = new MemoryStream();
    image.Save(outputStream, ImageFormat.Jpeg)
    outputStream.ToArray()

Now we just need to return that image as an HTTP response:

let createResponse bytes =  
    let response = new HttpResponseMessage()
    response.Content <- new ByteArrayContent(bytes)
    response.StatusCode <- HttpStatusCode.OK
    response.Content.Headers.ContentType <- MediaTypeHeaderValue("image/jpeg")

    response

That’s all the plumbing we need here for our function, so all that’s left is to define the Run method that Azure Functions will actually invoke:

let Run (req: HttpRequestMessage) =  
    let bytes = getImage <| getImageUrl req

    getFaces bytes
    |> drawImage bytes
    |> createResponse

In less than 80 lines of code we’re taking a URL input, downloading an image, detecting faces and emotions, drawing emoji over each face, and returning the new image as an HTTP response. Let’s try it out!
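Before we get to the results, here’s a hypothetical way to smoke test the deployed function from F# itself, reusing FSharp.Data’s Http helper from run.fsx. The <yourapp> placeholder and the test image URL are stand-ins you’d substitute with your own Function App name and an image containing faces:

open System.IO
open FSharp.Data

// Hypothetical smoke test for the deployed function; with anonymous auth the
// default route is /api/<functionName>. Replace <yourapp> and the image URL.
let testImage = "https://example.com/group-photo.jpg"

match (Http.Request("https://<yourapp>.azurewebsites.net/api/facemoji",
                    query = [ "url", testImage ])).Body with
| Binary bytes -> File.WriteAllBytes("result.jpg", bytes) // save the emoji-fied image
| Text _ -> failwith "expected binary image content"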

Results

Let’s start out with an image that’s clearly full of anger:

Anger

Okay, let’s counter that with a nice happy train:

Happy

Nobody has ever known sadness quite like Jon Snow:

Sadness

And finally, Kevin McCallister to test out fear:

Fear

Not bad!

Not bad

All of the code for this app is available on GitHub.

Details

Xamarin: Xamarin Developer Events in September

It’s almost fall already! Now is the perfect time to find Xamarin developer events happening at your local mobile C# or .NET community user group to learn how you can build mobile apps for iOS, Android, and Windows in C# using Xamarin. If you hadn’t heard, Xamarin is now included in all editions of Visual […]

The post Xamarin Developer Events in September appeared first on Xamarin Blog.

Details