Not even worth one star: Reacting constructively to App Store reviews

Instant rage

I’ve done it, you’ve done it, we’ve all done it. We’ve reacted with rage to some frustrating failure in an app, and left a negative, one-star review. It’s a different experience, though, when you are on the receiving end of these kinds of reviews. Trying not to respond immediately to the review with that same flash of anger takes experience, detachment, and maturity.

The user doesn’t understand the intricacies of your app. They don’t know about the hours you spent, unsure whether what you were trying to do was even possible; hundreds of evenings and weekends locked in a battle between what you imagined should be possible to program, and reality. All they know is that the app didn’t work properly, and guess what? It’s your fault.

I’m sure that some people get a feeling of release, letting go of the anger by leaving a one-star review… it will hurt the developer, and it makes the person leaving the review feel a little less upset. And you know what? They are right: as an app developer, when you receive a one-star review it does hurt, it does make you feel bad, and if you’re not careful, you’ll respond to the review in kind.

It’s not about the money

It is tempting to think that people who react with vitriol to a $1.99 app are getting things out of perspective. It’s only two bucks, after all: the price of a cup of coffee. But if you are on the US minimum wage of US$7.25/hour, two dollars is a lot of money.

But no matter what the app costs, it is not about the money. Nobody likes to feel like they have been cheated, and that is how people feel, rightly or wrongly, when they have an issue with an app, and it doesn’t behave as they expect. Cost is irrelevant. Nobody likes to feel like they have been taken for a fool.

You know what? They might be right

I try to keep an open mind to the possibility that sometimes they are reporting genuine issues. I cringe when I remember the time I unwittingly targeted people with small hands and bad eyesight: a code displayed to log in to Amazon wasn’t visible if you had a small watch with the font size set to large. D’oh.

Then there are the people who genuinely need my app: the visually impaired, or people with mobility issues. I did an appallingly bad job in the first version of the app of supporting people who are visually impaired. With hindsight it is kind of obvious that an Alexa implementation for watches, phones, tablets and computers might be useful for the visually impaired.


No matter whether the person has found a bug in the app, or if they simply don’t understand how software works, my approach to app store reviews is to try to be polite, kind, and empathise with the person, especially if it is clear that they are clueless and blaming you unfairly.

I’ve found that a passionately negative person, if met with sympathy and a genuine willingness to help, can become your most passionate advocate. You can flip someone from hating your app to really supporting it if you show them that you are listening, and that you are willing to work with them to address their issues.

Don’t respond immediately if you “know” how stupid, idiotic, and clueless the person leaving the review is. Take a step back, move away from the keyboard, and try to see things from the other person’s perspective. Try to understand where they are coming from, and leave a helpful response, assuming the best possible interpretation of their review.

Remember the many positive reviews you’ve received … try to not just focus on the negative. Don’t forget that you are doing this for the fun of it, don’t let them bring you down. Don’t spread the hate … try and understand.

Xamarin Tizen Networking: Under the covers of HTTP/2 in .NET

My current side/passion project requires the use of HTTP/2: It’s a .NET implementation of the Alexa Voice Service and I use it to drive Voice in a Can: Alexa for iOS, Apple Watch, Mac, Android, Android Wear, and … Tizen.

This isn’t an advert, but I do want to set the context: the Alexa Voice Service requires the use of HTTP/2, and this is a real-world product, not an academic exercise.

Why HTTP/2?

The reason the Alexa Voice Service requires HTTP/2 is that as well as the normal requests a client makes (send a request and get a response), the Alexa Voice Service specifies that a client keep a long-running downchannel HTTP/2 connection open so that it can use the HTTP/2 server push mechanism to send directives to the client.

For example, when the client wants to recognize what is being said, it sends a Recognize event to the Alexa Voice Service. This consists of a multipart MIME message: the first part is JSON indicating that a recognize request is being sent, and the second part is binary data containing the audio samples streamed from the microphone.

Whilst the microphone data is being streamed, the Alexa Voice Service can detect that the person has stopped speaking (silence detection) and it uses the downchannel to asynchronously send a StopCapture directive, at which point the client stops recording and finishes the request.
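To make the shape of that request concrete, here is a rough sketch of how such a multipart body can be put together with the .NET MultipartContent class. The part names, boundary, and event JSON here are simplified illustrations, not the exact AVS wire format:

// Sketch only; requires System, System.IO, System.Net.Http, System.Net.Http.Headers, System.Text
HttpContent BuildRecognizeContent(Stream microphoneStream) {
  var multipart = new MultipartContent("form-data", "boundary-" + Guid.NewGuid());

  // First part: JSON describing the Recognize event (simplified illustration)
  var eventJson = "{\"event\":{\"header\":{\"namespace\":\"SpeechRecognizer\",\"name\":\"Recognize\"}}}";
  var jsonPart = new StringContent(eventJson, Encoding.UTF8, "application/json");
  jsonPart.Headers.ContentDisposition = new ContentDispositionHeaderValue("form-data") { Name = "metadata" };
  multipart.Add(jsonPart);

  // Second part: the microphone audio, streamed as it is captured
  var audioPart = new StreamContent(microphoneStream);
  audioPart.Headers.ContentType = new MediaTypeHeaderValue("application/octet-stream");
  audioPart.Headers.ContentDisposition = new ContentDispositionHeaderValue("form-data") { Name = "audio" };
  multipart.Add(audioPart);

  return multipart;
}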

So HTTP/2 is a must. You can’t create an AVS client without supporting it.

On platforms such as iOS, WatchOS, MacOS and Android I’ve abstracted the HTTP functionality behind an interface, and used platform-specific code to implement that interface (NSUrlSession, OkHttp, etc.).
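Conceptually the abstraction is nothing exotic; it looks roughly like this (the names are illustrative, not my actual API):

// Illustrative shape only; the real interface has more members.
// Requires System, System.IO, System.Net.Http, System.Threading, System.Threading.Tasks
public interface IHttp2Transport {
  // Send an event (optionally with a streamed body) and return the streamed response
  Task<Stream> SendAsync(HttpMethod method, string url, Stream body, CancellationToken cancellationToken);

  // Open the long-lived downchannel and invoke the callback for each pushed directive
  Task OpenDownchannelAsync(string url, Action<Stream> onDirective, CancellationToken cancellationToken);
}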

On Tizen I wanted to see if I could just use the .NET platform.

Forcing HTTP/2 to be used by the .NET HttpClient

The first challenge was to make the .NET HttpClient use HTTP/2.

This turned out to be surprisingly easy. I needed to specify the HttpRequestMessage.Version.

This was my original code for sending a message:

var content = stream == null ? null : new StreamContent(stream);
var request = new HttpRequestMessage(httpMethod, url) {
  Content = content,
  Version = new Version(2, 0)
};
var response = await _httpClient.SendAsync(request, cancellationToken);

Notice how I’m setting the Version property.

Handling streamed responses as the data arrives

The second challenge is that by default the HttpClient waits for the complete response. This doesn’t work with the Alexa Voice Service, because it streams responses. If you ask “Alexa, what is Pi to 100 decimal places” you don’t want to wait for the complete response to arrive before you start hearing it … you want the response to stream and be played as it is received.

The solution to this was an additional parameter when calling SendAsync. You can specify whether you want the HttpClient to wait until the complete response is received, or just the HTTP headers, using the HttpCompletionOption.

var response = await _httpClient.SendAsync(request, HttpCompletionOption.ResponseHeadersRead, cancellationToken);
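Once only the headers have been read, the body can be consumed incrementally. A minimal sketch (ProcessChunk is a placeholder for whatever plays or parses the audio):

// Read the response body as it arrives rather than waiting for it to complete
using (var responseStream = await response.Content.ReadAsStreamAsync()) {
  var buffer = new byte[8192];
  int bytesRead;
  while ((bytesRead = await responseStream.ReadAsync(buffer, 0, buffer.Length, cancellationToken)) > 0) {
    // ProcessChunk is a placeholder: e.g. feed the decoder/player so playback
    // starts before the download has finished
    ProcessChunk(buffer, bytesRead);
  }
}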

Tizen uses the .NET Core UNIX HttpClient implementation

There were times when I wanted to look at how the Tizen HttpClient was implemented. One of the many delights of Xamarin and the “new” Microsoft is that pretty much everything is open source.

I went digging, expecting to find a Tizen HttpClientHandler but to my surprise, I found it was using the .NET Core UNIX HttpClient. The source is here (it uses Curl).

Enabling logging

One final tip. Sometimes you want to see what is happening under the hood. When looking through the source I found logging statements, and I wanted to see the logs, such as this code from the CurlHandler:

static CurlHandler()
{
    // curl_global_init call handled by Interop.LibCurl's cctor

    Interop.Http.CurlFeatures features = Interop.Http.GetSupportedFeatures();
    s_supportsSSL = (features & Interop.Http.CurlFeatures.CURL_VERSION_SSL) != 0;
    s_supportsAutomaticDecompression = (features & Interop.Http.CurlFeatures.CURL_VERSION_LIBZ) != 0;
    s_supportsHttp2Multiplexing = (features & Interop.Http.CurlFeatures.CURL_VERSION_HTTP2) != 0 && Interop.Http.GetSupportsHttp2Multiplexing() && !UseSingletonMultiAgent;

    if (NetEventSource.IsEnabled)
    {
        EventSourceTrace($"libcurl: {CurlVersionDescription} {CurlSslVersionDescription} {features}");
    }
}

To see these log messages I first declared a _myEventListener member, which is an EventListener:

private MyEventListener _myEventListener;

Then later in my code I initialized the _myEventListener:

  var netEventSource = EventSource.GetSources().FirstOrDefault(es => es.Name == "Microsoft-System-Net-Http");
  if (netEventSource != null && _myEventListener == null) {
    _myEventListener = new MyEventListener();
    _myEventListener.EnableEvents(netEventSource, EventLevel.LogAlways);
  }

The event listener is declared like this. Note the filtering of a couple of hard-coded strings that were polluting my output:

class MyEventListener : EventListener {
  protected override void OnEventWritten(EventWrittenEventArgs eventData) {
    var memberNameIndex = eventData.PayloadNames.IndexOf("memberName");

    var memberName = memberNameIndex == -1 ? null : eventData.Payload[memberNameIndex].ToString();

    // Build a "name=value" list from the payload, skipping the memberName entry
    var message = new StringBuilder();
    for (var i = 0; i < eventData.Payload.Count; i++) {
      if (i == memberNameIndex) continue;
      if (i > 0) {
        message.Append(", ");
      }
      message.Append(eventData.PayloadNames[i] + "=" + eventData.Payload[i]);
    }

    var last = eventData.Payload.Last().ToString();

    // Filter out a couple of hard-coded messages that flood the output
    if (last == "Ask libcurl to perform any available work...") return;
    if (last == "...done performing work: CURLM_OK") return;
    if (string.IsNullOrWhiteSpace(last)) return;

    if (memberName == null) {
      Log.D(message, "", "CurlHandler");
    } else {
      // ReSharper disable once ExplicitCallerInfoArgument
      Log.D(message, memberName, "CurlHandler");
    }
  }
}
My logger uses the Tizen system Log class, calling Tizen.Log.Debug("viac", message, "", "", 0); to write to the log.
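For completeness, the Log.D helper used above is just a thin wrapper over that call, something along these lines (the exact signature of my logger is not important):

static class Log {
  // Thin wrapper over the Tizen system log; "viac" is the tag filtered with sdb dlog
  public static void D(object message,
                       [System.Runtime.CompilerServices.CallerMemberName] string memberName = "",
                       string source = "") {
    Tizen.Log.Debug("viac", source + " " + memberName + " " + message, "", "", 0);
  }
}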

I used this command line to view the log:

sdb dlog viac:D

An extract of the output in all its glory:

D/viac    ( 7582):  18:30:26 []  TizenNetworkImpl MakeHttpRequest Sending...
D/viac    ( 7582):  18:30:26 []  CurlHandler SendAsync thisOrContextObject=HttpClient#52727599, parameters=(Method: GET, RequestUri: '', Version: 2.0, Content: <null>, Headers:
D/viac    ( 7582): {
D/viac    ( 7582):   Authorization: Bearer ...
D/viac    ( 7582): })
D/viac    ( 7582):  18:30:26 []  CurlHandler .ctor thisOrContextObject=CurlResponseMessage#51192825, parameters=(OK)
D/viac    ( 7582):  18:30:26 []  CurlHandler RequestMessage thisOrContextObject=CurlResponseMessage#51192825, first=CurlResponseMessage#51192825, second=HttpRequestMessage#38539564
D/viac    ( 7582):  18:30:26 []  CurlHandler Content thisOrContextObject=CurlResponseMessage#51192825, first=CurlResponseMessage#51192825, second=NoWriteNoSeekStreamContent#64971671
D/viac    ( 7582):  18:30:26 []  CurlHandler SendAsync handlerId=26756241, workerId=4, requestId=5, message=Method: GET, RequestUri: '', Version: 2.0, Content: <null>, Headers:
D/viac    ( 7582): {
D/viac    ( 7582):   Authorization: Bearer ...
D/viac    ( 7582): }
D/viac    ( 7582):  18:30:26 []  CurlHandler SendAsync thisOrContextObject=HttpClient#52727599, result=System.Threading.Tasks.Task`1[System.Net.Http.HttpResponseMessage]

Final thoughts

When I first learned to program I spent evening after evening of focused hours trying to break the copy protection on 8-bit games, not to steal them (I’d already bought them), but to disassemble them in order to work out how to get infinite lives.

I often think that despite the formal training I later received getting a degree in computer science, those childhood hours of fierce, focused concentration, trying to accomplish something I wasn’t even sure was possible, were the best training I ever had.

I had no idea whether I could get the Alexa Voice Service running on Tizen, whether I could get HTTP/2 working, or a myriad other things. Sometimes you just have to keep trying, having faith in your abilities, continually trying different approaches, until eventually, one day:

Auto launching Xamarin Mac apps at login

I have an app, called Voice in a Can, which lets you use Alexa on your Apple Watch and iPhone. I’m working on bringing it to the Mac, and one of the things I want is that it be started at login, if the user wants this.

To do this in a sandboxed app, you need to create a helper app, and bundle it inside your main app, in a specific location (/Contents/Library/LoginItems). This helper app is automatically launched at startup, and has no UI – all it does is launch the main app, which in my case sits as an icon in the system toolbar.

There is a great blog post on how to do this by Artur Shamsutdinov, on which this post is based. This post adds some detail, information on how to use MSBuild, and troubleshooting tips. You really should check out Artur’s post too.

I created a main application, in my case it is called VoiceInACan.AppleMac:

I made sure this was signed, and configured to use the SandBox.

In my AppDelegate I called SMLoginItemSetEnabled to tell MacOS to launch my helper app at startup (com.atadore.VoiceInACanForMacLoginHelper is the bundle ID of my helper app, defined below):

    // P/Invoke into the ServiceManagement framework (requires System.Runtime.InteropServices)
    [DllImport("/System/Library/Frameworks/ServiceManagement.framework/ServiceManagement")]
    static extern bool SMLoginItemSetEnabled(IntPtr aId, bool aEnabled);

    public static bool StartAtLogin(bool value) {
      CoreFoundation.CFString id = new CoreFoundation.CFString("com.atadore.VoiceInACanForMacLoginHelper");
      return SMLoginItemSetEnabled(id.Handle, value);
    }

    public override void DidFinishLaunching(NSNotification notification) {
      var worked = StartAtLogin(true);
    }

In a real app you won’t want to auto-launch a sandboxed app without permission from the user, since your app will be rejected by App Review when you submit it.
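For example, the call can be gated behind a user preference; “LaunchAtLogin” here is just an illustrative NSUserDefaults key, not part of any API:

// Only register (or unregister) the helper once the user has made a choice;
// "LaunchAtLogin" is an illustrative preference key
var launchAtLogin = NSUserDefaults.StandardUserDefaults.BoolForKey("LaunchAtLogin");
StartAtLogin(launchAtLogin);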

I created a helper Mac app, as another project, in my case called VoiceInACan.AppleMacLoginHelper.

I made sure this was signed, and configured to use the SandBox.

I edited the storyboard to uncheck Is Initial Controller (in the properties on the right) to ensure the helper app has no UI:

I updated Info.plist to indicate the app is background only (the LSBackgroundOnly key), because it will have no UI and serves purely to launch my main app at startup:

I added a dependency from my main app to the helper app by right-clicking on References in my main app, selecting Edit References, going to the Projects tab and checking the checkbox next to my helper app:

This ensures that the helper app is built before my main app.

In my AppDelegate.cs in my helper app, I launch my main app:

using System.Linq;
using AppKit;
using Foundation;

namespace AppleMacLoginHelper {
  public class AppDelegate : NSApplicationDelegate {
    public AppDelegate() {
    }

    public override void DidFinishLaunching(NSNotification notification) {
      System.Console.WriteLine("ViacHelper: starting");
      if (!NSWorkspace.SharedWorkspace.RunningApplications.Any(a => a.BundleIdentifier == "com.atadore.VoiceInACanForMac")) {
        System.Console.WriteLine("ViacHelper: Got bundle");
        // The helper lives at MainApp.app/Contents/Library/LoginItems/Helper.app,
        // so stripping four path components gives the main app's bundle
        var path = new NSString(NSBundle.MainBundle.BundlePath)
          .DeleteLastPathComponent()
          .DeleteLastPathComponent()
          .DeleteLastPathComponent()
          .DeleteLastPathComponent();
        var pathToExecutable = path + @"/Contents/MacOS/VoiceInACan";
        System.Console.WriteLine("ViacHelper: Got path: " + pathToExecutable);

        if (NSWorkspace.SharedWorkspace.LaunchApplication(pathToExecutable)) {
          System.Console.WriteLine("ViacHelper: Launched: " + pathToExecutable);
        } else {
          System.Console.WriteLine("ViacHelper: Failed to launch: " + pathToExecutable);
        }
      }
      System.Console.WriteLine("ViacHelper: dying");
    }

    public override void WillTerminate(NSNotification notification) {
      // Insert code here to tear down your application
    }
  }
}

I updated my main app to embed the helper app within it

So far I’ve created two apps: the main app, which provides my main functionality (in my case Alexa), and a helper app which has no functionality other than to launch the main app. In order for SMLoginItemSetEnabled to work, the helper app needs to be embedded within the main app.

To do this, I edited the csproj of my main app, and added markup to embed the helper app. Here are the pieces; the complete thing is below:

First, define an ItemGroup that references all the files in the helper app’s bundle (the Configuration refers to Debug or Release):

  <ItemGroup>
    <HelperApp Include="$(ProjectDir)/../VoiceInACan.AppleMacLoginHelper/bin/$(Configuration)/**" />
  </ItemGroup>

Next, copy those files into the right place in the main app (note that this is done after _CopyContentToBundle so that it is copied before the build signs the final bundle):

  <Target Name="CopyHelper" AfterTargets="_CopyContentToBundle">
    <Message Text="Copying helper app" />
    <MakeDir Directories="$(AppBundleDir)/Contents/Library" />
    <MakeDir Directories="$(AppBundleDir)/Contents/Library/LoginItems" />
    <Copy SourceFiles="@(HelperApp)" DestinationFiles="@(HelperApp->'$(AppBundleDir)/Contents/Library/LoginItems/')" />

Finally, the embedded bundle’s files can be signed (this may not be necessary … first try without it):

  <Target Name="CodeSignHelper" AfterTargets="CopyHelper">
    <Message Text="Signing helper app" />
    <Codesign SessionId="$(BuildSessionId)" ToolExe="$(CodesignExe)" ToolPath="$(CodesignPath)" CodesignAllocate="$(_CodesignAllocate)" Keychain="$(CodesignKeychain)" Resources="$(AppBundleDir)/Contents/Library/LoginItems/" SigningKey="$(_CodeSigningKey)" ExtraArgs="$(CodesignExtraArgs)">

This is my complete modification to my csproj (after the import of the Xamarin.Forms.targets):

  <Import Project="..\packages\Xamarin.Forms.\build\Xamarin.Forms.targets" Condition="Exists('..\packages\Xamarin.Forms.\build\Xamarin.Forms.targets')" />
    <HelperApp Include="$(ProjectDir)/../VoiceInACan.AppleMacLoginHelper/bin/$(Configuration)/**" />
  <Target Name="CopyHelper" AfterTargets="_CopyContentToBundle">
    <Message Text="Copying helper app" />
    <MakeDir Directories="$(AppBundleDir)/Contents/Library" />
    <MakeDir Directories="$(AppBundleDir)/Contents/Library/LoginItems" />
    <Copy SourceFiles="@(HelperApp)" DestinationFiles="@(HelperApp->'$(AppBundleDir)/Contents/Library/LoginItems/')" />
   <Target Name="CodeSignHelper" AfterTargets="CopyHelper">
    <Message Text="Signing helper app" />
    <Codesign SessionId="$(BuildSessionId)" ToolExe="$(CodesignExe)" ToolPath="$(CodesignPath)" CodesignAllocate="$(_CodesignAllocate)" Keychain="$(CodesignKeychain)" Resources="$(AppBundleDir)/Contents/Library/LoginItems/" SigningKey="$(_CodeSigningKey)" ExtraArgs="$(CodesignExtraArgs)">


Finally, copy your main app’s bundle to the Applications folder, and run it so that it registers the embedded helper to start at login.

Troubleshooting SMLoginItemSetEnabled

The first challenge is getting log information. If you run the Console app, it only shows you information from after it was launched, which is after you log in. You can get historical information from the terminal:

sudo log collect --last 1d
open system_logs.logarchive

This will show you the last day’s worth of logs. You’ll want to look for messages from otherbsd.

The second challenge I faced was that although I had registered the startup item, it wasn’t being launched. I was getting this cryptic error: Could not submit LoginItem job com.atadore.VoiceInACanForMacLoginHelper: 119: Service is disabled.

After Googling, I discovered the lsregister command, and was able to see many, many “registrations” of my helper app, from development builds, backups, etc.:

/System/Library/Frameworks/CoreServices.framework/Frameworks/LaunchServices.framework/Support/lsregister -dump | grep | more

What fixed it for me (your mileage may vary, and you should really check what these commands do before executing them) was:

/System/Library/Frameworks/CoreServices.framework/Frameworks/LaunchServices.framework/Support/lsregister -gc
/System/Library/Frameworks/CoreServices.framework/Frameworks/LaunchServices.framework/Support/lsregister -kill

I then re-ran my main app, which re-registered my helper app as a single entry in lsregister and, joy, my app now launches at login. I started working on this yesterday at 7:30 am and got it working around 1:30 pm. I’m hoping that if you need to do something similar, this post will shave a little time off your experience!


There is no way I’d have got this working without Artur Shamsutdinov’s blog post from 2016.

Running Xamarin Forms apps on the new Tizen 4.0 Samsung Galaxy Watch

I picked up a new Samsung Galaxy Watch (SM-R800) today, and after spending an evening on it, I managed to deploy and run a Xamarin Forms (Tizen 4.0) app on it … I just tried the default template:

In case it helps someone else, these are some of the things I did. FWIW I’m using Windows running in Parallels on a Mac.

  1. Install the Tizen tools for Visual Studio, and create a new Tizen XAML App (Xamarin.Forms)
  2. Enable development mode on the watch by tapping the software version
  3. Enable Wifi on the watch, and note the IP address
  4. Run the Device Manager (Tools|Tizen|Device Manager) and use the Scan button … this should detect your watch (It didn’t initially for me because I’d forgotten to set my Windows network to Private)
  5. Run the Tizen Package Manager (Tools|Tizen) and ensure you have Samsung Certificate Extension installed under Extension SDK
  6. Run the Tizen Certificate Manager (Tools|Tizen). Click the “+”. If you don’t see Samsung listed then check the previous step. Choose Samsung and run through all the steps (including signing in with a Samsung account).
  7. This is the part that tripped me up. Under Tools|Options|Tizen ensure you have “Sign the .TPK file…” checkbox checked:
  8. Build and Run (I got a hang running with the debugger, but when I started without debugging it worked). You should see the watch as the device in Visual Studio:

I’m sure I’ve forgotten something … it was a long night getting this running so feel free to reply and I’ll see if I can help.

Screencast: Your computer screen as an Alexa Smart Home Security Camera

This is a screencast I just put together showing how you can expose your computer’s screen as an Alexa Smart Home Security Camera.

I wanted this because I already have security camera software running on a Windows desktop … all I wanted was to say “Alexa, show security cameras” and see the software running on that computer.

Source referenced in the screencast is here

Using Siri to control your Alexa Smart Home devices

I have many Smart Home devices that can be controlled from my Amazon Echo, however none of those devices can be controlled from Siri on my Apple Watch or iPhone. None are HomeKit compatible.

What I’ve done lets me control my Alexa Smart Home devices via Siri on my Apple Watch or iPhone. The solution is not elegant (it involves a Raspberry Pi, HomeBridge and a speaker) but it does work…

Code here. Demo here: