Build, Notarize, Repeat: Some tips to avoid endlessly failing to notarize your Xamarin Mac app.

As with most of my posts, this is a message to my future self, who will have forgotten everything I did today to get my Xamarin Mac app notarized. I hope it helps other people too.

Apple rejected my app from the store: "… please include specific macOS features that the application will use and not aggregated content." I’ve no idea what that means, but I interpreted it as "we like Siri, we don’t like Alexa, and your app brings Alexa to the Mac, so go away." I needed to release it outside the store, and that means Notarization.

It took most of a day, and I had many errors. This is my story.

Disclaimer: These instructions worked for me as of 29 Jul 2019. The further out into the future you are reading this, the less likely these instructions are to be relevant.

Resources

This blog post by Microsoft’s David Ortinau is the best starting point. It took me 50% of the way there.

I also found this GitHub issue to be useful … as of writing it is still open.

Apple’s documentation on notarization and a corresponding trouble-shooting guide are useful if you can understand them. I finally felt like I understood them after I got everything working.

Install the right version of Xamarin.Mac

Although David’s post mentions installing a specific version of Xamarin.Mac (d16-1), at this point I think that release had made it into the stable branch, because I was able to get this working using the latest stable release:

The easy part: Update the plist and csproj

Per the instructions I edited my csproj to include the UseHardenedRuntime property:

  <PropertyGroup Condition=" '$(Configuration)|$(Platform)' == 'Release|Mac' ">
    <Optimize>true</Optimize>
    <OutputPath>bin\Release</OutputPath>
    <DefineConstants>__UNIFIED__;__MACOS__</DefineConstants>
    <ErrorReport>prompt</ErrorReport>
...
    <CodeSignEntitlements>Entitlements.plist</CodeSignEntitlements>
    <CodeSignProvision>Developer ID Mac PP</CodeSignProvision>
    <PlatformTarget>x86</PlatformTarget>
    <UseHardenedRuntime>true</UseHardenedRuntime>
  </PropertyGroup>

I also updated my Entitlements.plist to allow JIT (plus Microphone access, because how else are you going to talk to Alexa, and Location, so that Alexa knows where you are when you ask for the time or weather, although the app still works if the user denies location access).
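
The relevant entries look roughly like this (a sketch rather than my exact file; com.apple.security.cs.allow-jit is the key I’m certain about, and I’m assuming the hardened-runtime key names for microphone and location access):

  <?xml version="1.0" encoding="UTF-8"?>
  <!DOCTYPE plist PUBLIC "-//Apple//DTD PLIST 1.0//EN" "http://www.apple.com/DTDs/PropertyList-1.0.dtd">
  <plist version="1.0">
  <dict>
    <!-- Let the Mono runtime JIT under the hardened runtime -->
    <key>com.apple.security.cs.allow-jit</key>
    <true/>
    <!-- Microphone access (assumed hardened-runtime key name) -->
    <key>com.apple.security.device.audio-input</key>
    <true/>
    <!-- Location access (assumed key name) -->
    <key>com.apple.security.personal-information.location</key>
    <true/>
  </dict>
  </plist>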

Sign your app with the appropriate cert and profile

This part was the trickiest for me. Apple’s documentation talks about using a Developer ID certificate, which I stupidly thought was just my developer certificate. no No NO. This is something different.

Creating the certificate and profile

Sign in to the Apple Developer web site, and open the Certificates, Identifiers & Profiles page and then Certificates.

Check the list of certificates you have for your Mac app. Do you see one with the TYPE called Developer ID Application?

If not, this is what you need. Tap the + at the top, and on the Create a New Certificate page in the Software section, select the Developer ID Application option.

Tap Continue, then generate and download the certificate.

I am assuming that you have already set up an Identifier for your Mac App, since you’ve likely been running it locally to develop it already.

The final step is to create a profile which combines the Developer ID Application certificate and your App Identifier. Go to the Profiles section, tap the + in the title and select the Developer ID option in the Distribution section.

On the next page select the App Id for your app from the dropdown:

On the next page select the Developer ID Certificate that you created above:

Finally give your provisioning profile a name and generate and download it:

Ensuring they are downloaded to your machine

This is the thing that screwed me up: I thought that every time I downloaded the certificate or provisioning profile to my machine using Apple’s web site, that was enough. It wasn’t.

What I finally figured out was that I also needed to open Apple’s Xcode app, go to Preferences and then Accounts, and tap Download Manual Profiles (sign in first if you’ve not already done so). This makes the profiles visible to Xcode and thus Visual Studio.
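
If you want to double-check that both actually made it onto your machine, a couple of quick terminal checks help (the paths are the usual defaults; I believe Mac profiles land in the same folder as iOS ones):

$ # The Developer ID Application identity should show up in your keychain
$ security find-identity -v -p codesigning
$ # The downloaded provisioning profiles live here (the filenames are GUIDs)
$ ls ~/Library/MobileDevice/Provisioning\ Profiles/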

Signing your app

Now that you’ve done all of the above, all being well, you should be able to open Visual Studio on your Mac, go to your Mac project’s settings, and in the Mac Signing section select the Developer ID provisioning profile you created.

Build your app in Release mode.
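
Before zipping anything up, it’s worth confirming which identity the build actually signed with; the Authority lines should mention your Developer ID Application certificate, not a regular development one (a sanity check rather than an official step):

$ codesign -dvv bin/Release/VoiceInACan.app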

Uploading the binary to Apple

That was the easy part. Next comes what I really hope will be easy for you too. You’ll be using some Apple command-line tools that need a password, which you can generate as an app-specific password.

In all the examples below I’ll use appspecificpassword for my password. There is also a way to store the password in the keychain, which is perfect for CI type scenarios.
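
If you do go the keychain route, I believe the flow is to store the app-specific password once under a name of your choosing (AC_PASSWORD here is just a name I picked) and then reference it with the @keychain: prefix in later commands:

$ xcrun altool --store-password-in-keychain-item "AC_PASSWORD" -u damian@mehers.com -p appspecificpassword
$ xcrun altool --notarize-app ... --password "@keychain:AC_PASSWORD" ...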

To upload the binary (the .app that was generated), open a terminal and navigate to your project’s bin/Release folder where you should find your project.app, which is actually a folder.

You need to upload it for notarization. First you need to zip it. Since my app is called Voice in a Can, my .app file is VoiceInACan.app:

$ zip -r VoiceInACan.zip VoiceInACan.app
  adding: VoiceInACan.app/ (stored 0%)
...
  adding: VoiceInACan.app/Contents/Info.plist (deflated 32%)
  adding: VoiceInACan.app/Contents/PkgInfo (stored 0%)
$ 
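
As an aside, Apple’s documentation suggests using ditto to create the archive, because it preserves symlinks and extended attributes inside the bundle; if plain zip gives you grief, this is worth trying instead:

$ /usr/bin/ditto -c -k --keepParent VoiceInACan.app VoiceInACan.zip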

To upload the zip file you created, use xcrun altool. You’ll need to use your Apple account. The --primary-bundle-id is just a placeholder; use what you want:

$ xcrun altool --notarize-app --primary-bundle-id "com.atadore.VoiceInACanForMac.Zip" --username damian@mehers.com --password "appspecificpassword" --file VoiceInACan.zip
2019-07-30 08:55:54.038 altool[1624:421061] No errors uploading 'VoiceInACan.zip'.
RequestUUID = 46ecfa36-4338-48de-b9ac-087d14826fad
$

This doesn’t give a response immediately. It uploads the file for processing. Note the RequestUUID that is output by the tool (in this case 46ecfa36-4338-48de-b9ac-087d14826fad). You’ll need this below to get the status of the notarization.

Always an edge-case

There is a 99.9% chance that you won’t get the error I describe here, but I did. If your Apple ID is associated with more than one developer account you will get the error "Your Apple ID account is attached to other iTunes providers. You will need to specify which provider you intend to submit content to by using the -itc_provider command."

To find the ids associated with your accounts use the iTMSTransporter command (again use your Apple ID):

$ /Applications/Xcode.app/Contents/Applications/Application\ Loader.app/Contents/itms/bin/iTMSTransporter -m provider -u damian@mehers.com -p appspecificpassword 
...
[2019-07-30 08:50:23 CEST] <main> DBG-X:   parameter Atadore SARL = AtadoreSARL

Provider listing:
   - Long Name -  - Short Name -
1  Atadore SARL   AtadoreSARL
...
$

After much grinding and processing this will eventually spit out a Provider listing with a Long Name and a Short Name. Take a note of the Short Name for the account you wish to use, and append -itc_provider "The Short Name" to every xcrun altool command.

For example in my case the above command became:

$ xcrun altool --notarize-app --primary-bundle-id "com.atadore.VoiceInACanForMac.Zip" --username damian@mehers.com --password "appspecificpassword" --file VoiceInACan.zip -itc_provider "AtadoreSARL"
$

Getting the status

Once you’ve uploaded the app for notarization you will have to wait for it to be processed. Continually checking the status will definitely speed things up. To do so, use this command:

$ xcrun altool --notarization-info edab1ebc-29be-45f1-8bca-98b89066196f --username "damian@mehers.com" --password "appspecificpassword"  -itc_provider "AtadoreSARL" 
2019-07-30 09:05:53.314 altool[1799:441307] No errors getting notarization info.

   RequestUUID: edab1ebc-29be-45f1-8bca-98b89066196f
          Date: 2019-07-30 07:05:28 +0000
        Status: in progress
    LogFileURL: (null)
$ 

The GUID (edab1…) that is passed as a parameter is the one returned when you uploaded the zip file above.
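
If you get tired of pressing up-arrow and Enter, a crude poll does the same thing (a sketch; adjust the sleep to taste):

$ while true; do xcrun altool --notarization-info edab1ebc-29be-45f1-8bca-98b89066196f --username "damian@mehers.com" --password "appspecificpassword" -itc_provider "AtadoreSARL" | grep -E "Status|LogFileURL"; sleep 30; done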

After a while of repeating this, you will eventually get the processing response. And it will fail. It always fails the first time.

$ xcrun altool --notarization-info edab1ebc-29be-45f1-8bca-98b89066196f --username "damian@mehers.com" --password "appspecificpassword"  -itc_provider "AtadoreSARL" 
2019-07-30 09:07:14.468 altool[1803:445374] No errors getting notarization info.

   RequestUUID: edab1ebc-29be-45f1-8bca-98b89066196f
          Date: 2019-07-30 07:05:28 +0000
        Status: invalid
    LogFileURL: https://osxapps-ssl.itunes.apple.com/itunes-assets/Enigma123/v4/b2/f3/ed/b...d/developer_log.json?accessKey=1...D
   Status Code: 2
Status Message: Package Invalid
$ 

To get the details, use the curl command to show the contents of the LogFileURL:

$ curl https://osxapps-ssl.itunes.apple.com/itunes-assets/Enigma123/v4/b2/f3/ed/b...d/developer_log.json?accessKey=1...D
{
  "logFormatVersion": 1,
  "jobId": "e...f",
  "status": "Invalid",
  "statusSummary": "Archive contains critical validation errors",
  "statusCode": 4000,
  "archiveFilename": "VoiceInACan.zip",
  "uploadDate": "2019-07-30T07:05:28Z",
  "sha256": "c...3",
  "ticketContents": null,
  "issues": [
    {
      "severity": "error",
      "code": null,
      "path": "VoiceInACan.zip/VoiceInACan.app/Contents/MacOS/VoiceInACan",
      "message": "The binary is not signed with a valid Developer ID certificate.",
      "docUrl": null,
      "architecture": "x86_64"
    },
    {
      "severity": "error",
      "code": null,
      "path": "VoiceInACan.zip/VoiceInACan.app/Contents/MonoBundle/libMonoPosixHelper.dylib",
      "message": "The binary is not signed with a valid Developer ID certificate.",
      "docUrl": null,
      "architecture": "x86_64"
    },
    {
      "severity": "error",
      "code": null,
      "path": "VoiceInACan.zip/VoiceInACan.app/Contents/MonoBundle/libmono-native.dylib",
      "message": "The binary is not signed with a valid Developer ID certificate.",
      "docUrl": null,
      "architecture": "x86_64"
    }
  ]
}
$ 

I deliberately signed the app with the wrong provisioning profile to get this error. Remember you must use the profile that incorporates the Developer ID Application certificate.

To add insult to injury, I also received a taunting email: "Your Mac software was not notarized."

You may have other errors. Perhaps you didn’t enable the hardened runtime, or allow JIT, or you need to enable an entitlement.

This is the Live, Die, Repeat section where you keep on making changes, building, uploading, waiting for it to be processed, reading the reason why it failed, and trying again. For me it was just the certificate.

Once I used the Developer ID Application certificate and generated a profile from it, and crucially used Xcode to Download Manual Profiles, I was good:

$ xcrun altool --notarization-info 5f56223e-3d13-4e47-9d01-cc572ba1ca04 --username "damian@mehers.com" --password "appspecificpassword"
2019-07-30 09:24:15.210 altool[1953:476339] No errors getting notarization info.

   RequestUUID: 5f56223e-3d13-4e47-9d01-cc572ba1ca04
          Date: 2019-07-30 07:21:39 +0000
        Status: success
    LogFileURL: https://osxapps-ssl.itunes.apple.com/itunes-assets/Enigma123/v4/bd/7c/98/b...D
   Status Code: 0
Status Message: Package Approved
$ 

When it finally worked, I couldn’t believe it. I felt like someone who’d won the lottery and could not believe their luck, obsessively comparing the numbers on their ticket with the ones on the screen. But sure enough I received the celebratory email too:

I used curl to check the log for warnings, but all was good. My situation was actually trickier than I let on above, because my app contains another app embedded inside it to allow it to be launched at startup if the user wishes:

$ curl https://osxapps-ssl.itunes.apple.com/itunes-assets/Enigma123/v4/bd/7c/98/b...D
{
  "logFormatVersion": 1,
  "jobId": "5..4",
  "status": "Accepted",
  "statusSummary": "Ready for distribution",
  "statusCode": 0,
  "archiveFilename": "VoiceInACan.zip",
  "uploadDate": "2019-07-30T07:21:39Z",
  "sha256": "6..f",
  "ticketContents": [
    {
      "path": "VoiceInACan.zip/VoiceInACan.app/Contents/Library/LoginItems/AppleMacLoginHelper.app",
      "digestAlgorithm": "SHA-256",
      "cdhash": "5...1",
      "arch": "x86_64"
    },
    {
      "path": "VoiceInACan.zip/VoiceInACan.app",
      "digestAlgorithm": "SHA-256",
      "cdhash": "d...b",
      "arch": "x86_64"
    },
   ...
    {
      "path": "VoiceInACan.zip/VoiceInACan.app/Contents/MonoBundle/libmono-native.dylib",
      "digestAlgorithm": "SHA-256",
      "cdhash": "f...9",
      "arch": "x86_64"
    }
  ],
  "issues": null
}
$ 

Staple

The final step is to staple the notarization information to the app. You do this using the staple command:

$ xcrun stapler staple VoiceInACan.app
Processing: /Users/damian/Projects/AppleWatchAlexa/VoiceInACan.AppleMac/bin/Release/VoiceInACan.app
Processing: /Users/damian/Projects/AppleWatchAlexa/VoiceInACan.AppleMac/bin/Release/VoiceInACan.app
The staple and validate action worked!
$ 
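
If you want extra reassurance, you can validate the staple and ask Gatekeeper for its verdict locally (optional sanity checks, not part of the required flow):

$ xcrun stapler validate VoiceInACan.app
$ spctl --assess --type execute --verbose VoiceInACan.app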

You can now make your app available for download, and it should run on anyone’s Mac without them having to do anything special to their security settings.

Conclusions

Whilst the arrival of Catalyst is raising many questions, including about the value of creating Xamarin Mac apps as opposed to simply bringing over an iPad app, there are some situations where native Mac apps make sense.

My app sits in the toolbar. It has a login helper to launch it at login. Two things that are likely not possible with Catalyst.

It is perhaps an edge-case, but in my experience everyone is an edge case, just in different ways. Maintaining the ability to create Xamarin Mac apps gives developers more flexibility and a chance to do something that otherwise would not be possible. Although I would understand if Microsoft dropped Xamarin Mac support, I’m hoping they won’t.

If you’d like to try out the Mac app, it is available here. And guess what? It is notarized!

Not even worth one star: Reacting constructively to App Store reviews

Instant rage

I’ve done it, you’ve done it, we’ve all done it. We’ve reacted with rage to some frustrating failure in an app, and left a negative, one star review. It’s a different experience though, when you are on the receiving end of these kinds of reviews. Trying to not immediately respond to the review with that same flash of anger takes experience, detachment, and maturity.

The user doesn’t understand the intricacies of your app. They don’t know of the hours you spent, not even sure if what you were trying to do was possible. Hundreds of evenings and weekends locked in a battle between what you imagined should be possible to program, and reality. All they know is that the app would not work properly, and guess what? It’s your fault.

I’m sure that some people get a feeling of release, letting go of the anger by leaving a one-star review… it will hurt the developer, and make the person leaving the review feel a little less upset. And you know what? They are right: as an app developer, when you receive a one-star review it does hurt, it does make you feel bad, and if you’re not careful, you’ll respond to the review in kind.

It’s not about the money

It is tempting to think that people reacting with vitriol to a $1.99 app are kind of getting things out of perspective. It’s only two bucks after all: the price of a cup of coffee. But if you are on the US minimum wage of US$7.25/hour, two dollars is a lot of money.

But no matter what the app costs, it is not about the money. Nobody likes to feel like they have been cheated, and that is how people feel, rightly or wrongly, when they have an issue with an app, and it doesn’t behave as they expect. Cost is irrelevant. Nobody likes to feel like they have been taken for a fool.

You know what? They might be right

I try to keep an open mind to the possibility that sometimes they are reporting genuine issues. I cringe when I remember the time I unwittingly targeted people with small hands and bad eyesight: a code displayed to log in to Amazon wasn’t visible if you had a small watch with the font size set to large. D’oh.

Then there are the people who genuinely need my app: the visually impaired, or people with mobility issues. I did an appallingly bad job of supporting people who are visually impaired in the first version of the app. It is kind of obvious with hindsight that I should have realised that an Alexa implementation for watches, phones, tablets and computers might be useful for the visually impaired.

Responding

Whether the person has found a bug in the app, or simply doesn’t understand how software works, my approach to app store reviews is to try to be polite and kind, and to empathise with the person, especially if it is clear that they are clueless and blaming you unfairly.

I’ve found that the passionately negative person, if responded to with sympathy and a genuine willingness to help, can become your most passionate advocate. You can flip someone from hating your app to really supporting it if you show them that you are listening, and that you are willing to work with them to try to address their issues.

Don’t respond immediately if you “know” how stupid, idiotic, and clueless the person leaving the review is. Take a step back, move away from the keyboard, and try to see things from the other person’s perspective. Try to understand where they are coming from, and leave a helpful response, assuming the best possible interpretation of their review.

Remember the many positive reviews you’ve received … try to not just focus on the negative. Don’t forget that you are doing this for the fun of it, don’t let them bring you down. Don’t spread the hate … try and understand.

Xamarin Tizen Networking: Under the covers of HTTP/2 in .NET

My current side/passion project requires the use of HTTP/2: It’s a .NET implementation of the Alexa Voice Service and I use it to drive Voice in a Can: Alexa for iOS, Apple Watch, Mac, Android, Android Wear, and … Tizen.

This isn’t an advert, but I do want to set the context. The Alexa Voice Service requires the use of HTTP/2, and this is a real-world product, not an academic exercise.

Why HTTP/2?

The reason the Alexa Voice Service requires HTTP/2 is that as well as the normal requests a client makes (send a request and get a response), the Alexa Voice Service specifies that a client keep a long-running downchannel HTTP/2 connection open so that it can use the HTTP/2 server push mechanism to send directives to the client.

For example, when the client sends a request to recognize what is being said, it sends a Recognize event to the Alexa Voice Service. This consists of a multipart MIME message: the first part is JSON indicating that a recognize request is being sent, and the second part is binary data containing the audio samples streamed from the microphone.

Whilst the microphone data is being streamed, the Alexa Voice Service can detect that the person has stopped speaking (silence detection) and it uses the downchannel to asynchronously send a StopCapture directive, at which point the client stops recording and finishes the request.

So HTTP/2 is a must. You can’t create an AVS client without supporting it.

On platforms such as iOS, watchOS, macOS and Android I’ve abstracted the HTTP functionality out behind an interface, and used platform-specific code to implement the interface (NSUrlSession, OkHttp etc).

On Tizen I wanted to see if I could just use the .NET platform.

Forcing HTTP/2 to be used by the .NET HttpClient

The first challenge was to make the .NET HttpClient use HTTP/2.

This turned out to be surprisingly easy. I needed to specify the HttpRequestMessage.Version.

Here is my code for sending a message:

var content = stream == null ? null : new StreamContent(stream);
var request = new HttpRequestMessage(httpMethod, url) {
  Content = content,
  Version = new Version(2, 0) // ask the handler to use HTTP/2 for this request
};
var response = await _httpClient.SendAsync(request, cancellationToken);

Notice how I’m setting the Version property.
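
If you want to confirm that HTTP/2 was actually negotiated rather than silently downgraded to 1.1, the negotiated version comes back on the response (a quick check I’d add while debugging):

Console.WriteLine($"Negotiated HTTP version: {response.Version}"); // expect 2.0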

Handling streamed responses as the data arrives

The second challenge is that by default the HttpClient waits for the complete response. This doesn’t work with the Alexa Voice Service because it streams responses. If you ask “Alexa, what is PI to 100 decimal places” you don’t want to wait for the complete response to return before you start hearing it … you want the response to stream and be played as it is received.

The solution to this was an additional parameter when calling SendAsync. You can specify whether you want the HttpClient to wait until the complete response is received, or just the HTTP headers, using the HttpCompletionOption.

var response = await _httpClient.SendAsync(request, HttpCompletionOption.ResponseHeadersRead, cancellationToken);
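
With ResponseHeadersRead, the body can then be consumed incrementally as Alexa streams it back. A minimal sketch of the read loop (ProcessChunk is a hypothetical stand-in for whatever parses and plays the multipart data):

var responseStream = await response.Content.ReadAsStreamAsync();
var buffer = new byte[8192];
int bytesRead;
while ((bytesRead = await responseStream.ReadAsync(buffer, 0, buffer.Length, cancellationToken)) > 0) {
  // Hand each chunk off as soon as it arrives instead of buffering the whole response
  ProcessChunk(buffer, bytesRead);
}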

Tizen uses the .NET Core UNIX HttpClient implementation

There were times when I wanted to look at how the Tizen HttpClient was implemented. One of the many delights of Xamarin and the “new” Microsoft is that pretty much everything is open source.

I went digging, expecting to find a Tizen HttpClientHandler, but to my surprise I found it was using the .NET Core UNIX HttpClient. The source is here (it uses Curl).

Enabling logging

One final tip. Sometimes you want to see what is happening under the hood. When looking through the source I found logging statements, and I wanted to see the logs, such as this code from the CurlHandler:

static CurlHandler()
{
    // curl_global_init call handled by Interop.LibCurl's cctor

    Interop.Http.CurlFeatures features = Interop.Http.GetSupportedFeatures();
    s_supportsSSL = (features & Interop.Http.CurlFeatures.CURL_VERSION_SSL) != 0;
    s_supportsAutomaticDecompression = (features & Interop.Http.CurlFeatures.CURL_VERSION_LIBZ) != 0;
    s_supportsHttp2Multiplexing = (features & Interop.Http.CurlFeatures.CURL_VERSION_HTTP2) != 0 && Interop.Http.GetSupportsHttp2Multiplexing() && !UseSingletonMultiAgent;

    if (NetEventSource.IsEnabled)
    {
        EventSourceTrace($"libcurl: {CurlVersionDescription} {CurlSslVersionDescription} {features}");
    }
    ...
}

To see these log messages I first declared a _myEventListener member, which is an EventListener:

private MyEventListener _myEventListener;

Then later in my code I initialized the _myEventListener:

  var netEventSource = EventSource.GetSources().FirstOrDefault(es => es.Name == "Microsoft-System-Net-Http");
  if (netEventSource != null && _myEventListener == null) {
    _myEventListener = new MyEventListener();
    _myEventListener.EnableEvents(netEventSource, EventLevel.LogAlways);
  }

The event listener is declared like this. Note the filtering of a couple of hard-coded strings that were polluting my output:

class MyEventListener : EventListener {
  protected override void OnEventWritten(EventWrittenEventArgs eventData) {
    var memberNameIndex = eventData.PayloadNames.IndexOf("memberName");

    var memberName = memberNameIndex == -1 ? null : eventData.Payload[memberNameIndex].ToString();

    var message = new StringBuilder();
    for (var i = 0; i < eventData.Payload.Count; i++) {
      if(i == memberNameIndex) continue;
      if (i > 0) {
        message.Append(", ");
      }
      message.Append(eventData.PayloadNames[i] + "=" + eventData.Payload[i]);
    }

    var last = eventData.Payload.Last().ToString();

    if(last == "Ask libcurl to perform any available work...") return;
    if (last == "...done performing work: CURLM_OK") return;
    if(string.IsNullOrWhiteSpace(last)) return;

    if (memberName == null) {
      Log.D(message);
    } else {
      // ReSharper disable once ExplicitCallerInfoArgument
      Log.D(message, memberName, "CurlHandler");
    }
  }
}

My logger uses Tizen.Log.Debug("viac", message, "", "",0); to output to the log, using the Tizen system Log class.
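
For context, given that Tizen.Log.Debug call and the way Log.D is invoked in the listener above, the wrapper boils down to something like this (a reconstruction, not my exact logger):

using System.Runtime.CompilerServices;

static class Log {
  // Forwards to the Tizen system log under the "viac" tag
  public static void D(object message,
                       [CallerMemberName] string memberName = "",
                       string className = "") {
    Tizen.Log.Debug("viac", $"{className} {memberName} {message}", "", "", 0);
  }
}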

I used this command line to view the log:

sdb dlog viac:D

An extract of the output in all its glory:

D/viac    ( 7582):  18:30:26 []  TizenNetworkImpl MakeHttpRequest Sending...
D/viac    ( 7582):  18:30:26 []  CurlHandler SendAsync thisOrContextObject=HttpClient#52727599, parameters=(Method: GET, RequestUri: 'https://avs-alexa-na.amazon.com/v20160207/directives', Version: 2.0, Content: <null>, Headers:
D/viac    ( 7582): {
D/viac    ( 7582):   Authorization: Bearer ...
D/viac    ( 7582): })
D/viac    ( 7582):  18:30:26 []  CurlHandler .ctor thisOrContextObject=CurlResponseMessage#51192825, parameters=(OK)
D/viac    ( 7582):  18:30:26 []  CurlHandler RequestMessage thisOrContextObject=CurlResponseMessage#51192825, first=CurlResponseMessage#51192825, second=HttpRequestMessage#38539564
D/viac    ( 7582):  18:30:26 []  CurlHandler Content thisOrContextObject=CurlResponseMessage#51192825, first=CurlResponseMessage#51192825, second=NoWriteNoSeekStreamContent#64971671
D/viac    ( 7582):  18:30:26 []  CurlHandler SendAsync handlerId=26756241, workerId=4, requestId=5, message=Method: GET, RequestUri: 'https://avs-alexa-na.amazon.com/v20160207/directives', Version: 2.0, Content: <null>, Headers:
D/viac    ( 7582): {
D/viac    ( 7582):   Authorization: Bearer ...
D/viac    ( 7582): }
D/viac    ( 7582):  18:30:26 []  CurlHandler SendAsync thisOrContextObject=HttpClient#52727599, result=System.Threading.Tasks.Task`1[System.Net.Http.HttpResponseMessage]

Final thoughts

When I first learned to program I spent evening after evening of focused hours trying to break the copy-protection on 8-bit games, not to steal them (I’d already bought them), but to try to disassemble them in order to work out how to get infinite lives.

I often think that despite the formal training I later received getting a degree in computer science, those childhood hours of fierce, focused concentration, trying to accomplish something I wasn’t even sure was possible, were the best training I ever had.

I had no idea whether I could get the Alexa Voice Service running on Tizen, whether I could get HTTP/2 working, or a myriad other things. Sometimes you just have to keep trying, having faith in your abilities, continually trying different approaches, until eventually, one day:

Auto launching Xamarin Mac apps at login

I have an app, called Voice in a Can, which lets you use Alexa on your Apple Watch and iPhone. I’m working on bringing it to the Mac, and one of the things I want is that it be started at login, if the user wants this.

To do this in a sandboxed app, you need to create a helper app, and bundle it inside your main app, in a specific location (/Contents/Library/LoginItems). This helper app is automatically launched at startup, and has no UI – all it does is launch the main app, which in my case sits as an icon in the system toolbar.

There is a great blog post on how to do this by Artur Shamsutdinov, which this post is based on. This post adds some detail, shows how to use MSBuild, and includes trouble-shooting information. You really should check out Artur’s post too.

I created a main application, in my case called VoiceInACan.AppleMac:

I made sure this was signed, and configured to use the Sandbox.

In my AppDelegate I called SMLoginItemSetEnabled to tell macOS to launch my helper app at startup (com.atadore.VoiceInACanForMacLoginHelper is the bundle ID of my helper app, defined below):

    [DllImport("/System/Library/Frameworks/ServiceManagement.framework/ServiceManagement")]
    static extern bool SMLoginItemSetEnabled(IntPtr aId, bool aEnabled);

    public static bool StartAtLogin(bool value) {
      CoreFoundation.CFString id = new CoreFoundation.CFString("com.atadore.VoiceInACanForMacLoginHelper");
      return SMLoginItemSetEnabled(id.Handle, value);
    }

    public override void DidFinishLaunching(NSNotification notification) {
      ...
      var worked = StartAtLogin(true);
      ...

In a real app you won’t want to auto-launch a sandboxed app without permission from the user, since your app will be rejected by App Review when you submit it.

I created a helper Mac app, as another project, in my case called VoiceInACan.AppleMacLoginHelper.

I made sure this was signed, and configured to use the Sandbox.

I edited the storyboard to uncheck Is Initial Controller (in the properties on the right) to ensure the helper app has no UI:

I updated Info.plist to indicate the app was background only, because it will have no UI and serve purely to launch my main app on startup.
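
I believe the entry in question is LSBackgroundOnly, which looks like this in the helper’s Info.plist:

  <key>LSBackgroundOnly</key>
  <true/>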

I added a dependency from my main app to the helper app by right-clicking on References in my main app, selecting Edit References, going to the Projects tab and checking the checkbox next to my helper app:

This ensures that the helper app is built before my main app.

In my AppDelegate.cs in my helper app, I launch my main app:

using System.Linq;
using AppKit;
using Foundation;

namespace AppleMacLoginHelper {
  [Register("AppDelegate")]
  public class AppDelegate : NSApplicationDelegate {
    public AppDelegate() {
    }

    public override void DidFinishLaunching(NSNotification notification) {
      System.Console.WriteLine("ViacHelper: starting");
      if (!NSWorkspace.SharedWorkspace.RunningApplications.Any(a => a.BundleIdentifier == "com.atadore.VoiceInACanForMac")) {
        System.Console.WriteLine("ViacHelper: Got bundle");
        var path = new NSString(NSBundle.MainBundle.BundlePath)
            .DeleteLastPathComponent()
            .DeleteLastPathComponent()
            .DeleteLastPathComponent()
            .DeleteLastPathComponent();
        var pathToExecutable = path + @"Contents/MacOS/VoiceInACan";
        System.Console.WriteLine("ViacHelper: Got path: " + pathToExecutable);

        if (NSWorkspace.SharedWorkspace.LaunchApplication(pathToExecutable)) {
          System.Console.WriteLine("ViacHelper: Launched: " + pathToExecutable);
        } else {
          NSWorkspace.SharedWorkspace.LaunchApplication(path);
          System.Console.WriteLine("ViacHelper: Launched: " + path);
        }
      }

      System.Console.WriteLine("ViacHelper: dying");
      NSApplication.SharedApplication.Terminate(this);
    }

    public override void WillTerminate(NSNotification notification) {
      // Insert code here to tear down your application
    }
  }
}

I updated my main app to embed the helper app within it

So far I’ve created two apps: the main app, which provides my main functionality (in my case Alexa), and a helper app which has no functionality other than to launch the main app. In order for SMLoginItemSetEnabled to work, the helper app needs to be embedded within the main app.

To do this, I edited the csproj of my main app and added markup to embed the helper app. Here are the pieces; the complete thing is below:

First, define an ItemGroup that references all the files in the helper app’s bundle (the Configuration refers to Debug or Release):

  <ItemGroup>
    <HelperApp Include="$(ProjectDir)/../VoiceInACan.AppleMacLoginHelper/bin/$(Configuration)/AppleMacLoginHelper.app/**" />
  </ItemGroup>

Next, copy those files into the right place in the main app (note that this is done after _CopyContentToBundle so that it is copied before the build signs the final bundle):

  <Target Name="CopyHelper" AfterTargets="_CopyContentToBundle">
    <Message Text="Copying helper app" />
    <MakeDir Directories="$(AppBundleDir)/Contents/Library" />
    <MakeDir Directories="$(AppBundleDir)/Contents/Library/LoginItems" />
    <Copy SourceFiles="@(HelperApp)" DestinationFiles="@(HelperApp->'$(AppBundleDir)/Contents/Library/LoginItems/AppleMacLoginHelper.app/%(RecursiveDir)%(Filename)%(Extension)')" />
  </Target>

Finally, the embedded bundle’s files can be signed (this may not be necessary … first try without this):

  <Target Name="CodeSignHelper" AfterTargets="CopyHelper">
    <Message Text="Signing helper app" />
    <Codesign SessionId="$(BuildSessionId)" ToolExe="$(CodesignExe)" ToolPath="$(CodesignPath)" CodesignAllocate="$(_CodesignAllocate)" Keychain="$(CodesignKeychain)" Resources="$(AppBundleDir)/Contents/Library/LoginItems/AppleMacLoginHelper.app" SigningKey="$(_CodeSigningKey)" ExtraArgs="$(CodesignExtraArgs)">
    </Codesign>
  </Target>

This is my complete modification to my csproj (after the import of the Xamarin.Forms.targets):

  <Import Project="..\packages\Xamarin.Forms.3.3.0.912540\build\Xamarin.Forms.targets" Condition="Exists('..\packages\Xamarin.Forms.3.3.0.912540\build\Xamarin.Forms.targets')" />
  <ItemGroup>
    <HelperApp Include="$(ProjectDir)/../VoiceInACan.AppleMacLoginHelper/bin/$(Configuration)/AppleMacLoginHelper.app/**" />
  </ItemGroup>
  <Target Name="CopyHelper" AfterTargets="_CopyContentToBundle">
    <Message Text="Copying helper app" />
    <MakeDir Directories="$(AppBundleDir)/Contents/Library" />
    <MakeDir Directories="$(AppBundleDir)/Contents/Library/LoginItems" />
    <Copy SourceFiles="@(HelperApp)" DestinationFiles="@(HelperApp->'$(AppBundleDir)/Contents/Library/LoginItems/AppleMacLoginHelper.app/%(RecursiveDir)%(Filename)%(Extension)')" />
  </Target>
   <Target Name="CodeSignHelper" AfterTargets="CopyHelper">
    <Message Text="Signing helper app" />
    <Codesign SessionId="$(BuildSessionId)" ToolExe="$(CodesignExe)" ToolPath="$(CodesignPath)" CodesignAllocate="$(_CodesignAllocate)" Keychain="$(CodesignKeychain)" Resources="$(AppBundleDir)/Contents/Library/LoginItems/AppleMacLoginHelper.app" SigningKey="$(_CodeSigningKey)" ExtraArgs="$(CodesignExtraArgs)">
    </Codesign>
  </Target>

</Project>

Finally copy your main app’s bundle to the Applications folder, and run it so that it registers the embedded helper to start on login.

Troubleshooting SMLoginItemSetEnabled

The first challenge is getting log information. If you run the Console app, it only shows you information from after it was launched, which is after you log in. You can get historical information from the terminal:

sudo log collect --last 1d
open system_logs.logarchive

This will show you the last day’s worth of logs. You’ll want to look for messages from otherbsd.
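
Alternatively, you can filter the collected archive from the terminal without opening Console, using a predicate (a sketch; adjust the search string to whatever identifies your helper):

log show system_logs.logarchive --info --predicate 'eventMessage CONTAINS "VoiceInACanForMacLoginHelper"'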

The second challenge I faced was that although I had registered the startup item correctly, it wasn’t being launched. I was getting this cryptic error: Could not submit LoginItem job com.atadore.VoiceInACanForMacLoginHelper: 119: Service is disabled.

After Googling, I discovered the lsregister command, and was able to see many, many “registrations” of my helper app, from development and backups etc.:

/System/Library/Frameworks/CoreServices.framework/Frameworks/LaunchServices.framework/Support/lsregister -dump | grep AppleMacLoginHelper.app | more

What fixed it for me (your mileage may vary, and you should really check what these commands do before executing them) was:

/System/Library/Frameworks/CoreServices.framework/Frameworks/LaunchServices.framework/Support/lsregister -gc
/System/Library/Frameworks/CoreServices.framework/Frameworks/LaunchServices.framework/Support/lsregister -kill

I then re-ran my main app, which re-registered my helper app as a single entry in lsregister, and joy, my app launches at startup. I started working on this yesterday at 7:30 am and got it working around 1:30 pm. I’m hoping that if you need to do something similar this post will shave a little time off your experience!

Acknowledgements

There is no way I’d have got this working without Artur Shamsutdinov’s blog post from 2016.

Running Xamarin Forms apps on the new Tizen 4.0 Samsung Galaxy Watch

I picked up a new Samsung Galaxy Watch (SM-R800) today, and after spending an evening on it, I managed to deploy and run a Xamarin Forms (Tizen 4.0) app on it … I just tried the default template:

In case it helps someone else, these are some of the things I did. FWIW I’m using Windows running in Parallels on a Mac.

  1. Install the Tizen tools for Visual Studio, and create a new Tizen XAML App (Xamarin.Forms)
  2. Enable development mode on the watch by tapping the software version
  3. Enable Wifi on the watch, and note the IP address
  4. Run the Device Manager (Tools|Tizen|Device Manager) and use the Scan button … this should detect your watch (it didn’t initially for me because I’d forgotten to set my Windows network to Private). If the scan doesn’t find it, you can also connect manually with sdb; see the sketch after this list.
  5. Run the Tizen Package Manager (Tools|Tizen) and ensure you have Samsung Certificate Extension installed under Extension SDK
  6. Run the Tizen Certificate Manager (Tools|Tizen). Click the “+”. If you don’t see Samsung listed then check the previous step. Choose Samsung and run through all the steps (including signing in with a Samsung account).
  7. This is the part that tripped me up. Under Tools|Options|Tizen ensure you have “Sign the .TPK file…” checkbox checked:
  8. Build and Run (I got a hang running with the debugger, but when I started without debugging it worked.). You should see the watch as the device in Visual Studio:
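
If the Device Manager scan in step 4 doesn’t find the watch, connecting manually with sdb from a terminal is another option (the IP is the one you noted in step 3; 26101 is the default port, and the address here is just an example):

sdb connect 192.168.1.42:26101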

I’m sure I’ve forgotten something … it was a long night getting this running so feel free to reply and I’ll see if I can help.

Screencast: Your computer screen as an Alexa Smart Home Security Camera

This is a screencast I just put together showing how you can expose your computer’s screen as an Alexa Smart Home Security Camera.

I wanted this because I already have security camera software running on a Windows desktop … all I wanted was to say “Alexa, show security cameras” and see the software running on that computer.

The source referenced in the screencast is here.

Using Siri to control your Alexa Smart Home devices

I have many Smart Home devices that can be controlled from my Amazon Echo, however none of those devices can be controlled from Siri on my Apple Watch or iPhone. None are HomeKit compatible.

What I’ve done lets me control my Alexa Smart Home devices via Siri on my Apple Watch or iPhone. This solution is not elegant (it involves a Raspberry Pi, HomeBridge and a speaker) but it does work…

Code here. Demo here: