Tuesday, December 10, 2013

The Blog has a new home

This is the final post at this location.  The new location of the blog is www.rajapet.com.  Visit me there!

All of the existing content has been moved over.  This version of the blog will be around for a while, but nothing new will happen here.

Thursday, December 05, 2013

A workaround for the files-in-use bug with HeatDirectory in MSBuild

I have this multi-project solution in Visual Studio 2013 and one of the projects is a Windows Installer project.  It uses WiX (Windows Installer XML) 3.8 and when I rebuild the solution, the final result is a nice .MSI file that will install the executable bits from the other projects.

To get the files that need to be bundled with the installer, I copy the files that I need from the project bin folders to a folder in the WiX project named “files”.  This folder is not part of the project or the solution and is not in source control.  I started out with a prebuild event on the WiX project that did the following:
  1. Delete the files folder.  I just assume that everything in the folder is obsolete.
  2. Robocopy the deployable files from a WPF project to the files folder.
  3. Robocopy an ASP.NET MVC 4 project to the files folder.
  4. Run ctt.exe (Config Transformation Tool) to clean up the web.config file and set some default values.
  5. Run the WiX harvest tool, heat.exe, to generate a .wxi include file of all of the files in the files folder.
Using robocopy makes it easy to copy just the files that you want and skip the files that are not needed for deployment.

With Windows Installer, every object that gets installed has to be defined in a WiX source file.  You end up with stuff that looks like:
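Something along these lines; the directory reference, GUIDs, and file names here are illustrative, not the actual project's:

<Fragment>
   <DirectoryRef Id="INSTALLFOLDER">
      <Component Id="cmpMyApp.exe" Guid="{11111111-2222-3333-4444-555555555555}">
         <File Id="filMyApp.exe" KeyPath="yes" Source="$(var.SourcePath)\MyApp.exe" />
      </Component>
   </DirectoryRef>
</Fragment>
<Fragment>
   <ComponentGroup Id="ProductFiles">
      <ComponentRef Id="cmpMyApp.exe" />
   </ComponentGroup>
</Fragment>

Multiply that by every file that gets deployed.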

Which is hideous to do by hand.  You can run heat.exe on a folder and it will generate the include file for all the files in that folder for you.  In my prebuild event, I had the following lines:
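The lines looked something like this sketch; the paths and the robocopy options are illustrative:

rmdir /s /q "$(ProjectDir)files"
robocopy "$(SolutionDir)MyWpfApp\bin\Release" "$(ProjectDir)files" /MIR /XF *.pdb
robocopy "$(SolutionDir)MyWebApp" "$(ProjectDir)files\web" /MIR /XD obj
ctt.exe source:"$(ProjectDir)files\web\web.config" transform:"$(ProjectDir)web.deploy.config" destination:"$(ProjectDir)files\web\web.config"
"%WIX%bin\heat.exe" dir "$(ProjectDir)files" -cg ProductFiles -dr INSTALLFOLDER -gg -scom -sreg -sfrag -srd -var var.SourcePath -out "$(ProjectDir)adminfiles.wxs"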

The 5th line is the heat command line.  The various command line options are documented here.  This ran without any problems on my dev machine.  Hilarity ensued when I tried to make a build from our TFS server.  I was getting build errors when it executed heat.exe:

heat.exe: Access to the path 'C:\Builds\31\VSTancillary\FleetVision_Dev\Sources\WixSetupProject\adminfiles.wxs' is denied.

That was annoying.  During the build, heat was recreating the adminfiles.wxs file each time.  Since that file was in source control, it was set to read only on the build server.  That caused heat.exe to abort since it couldn't recreate that file.  Our build engineer suggested using the attrib command to clear the read only bit.  The light bulb (LED, should last longer than incandescent) flickered above my head and I realized that since that file was in source control, I didn't need to create it on the build server.  I just needed to set up the build so that heat didn't run on the build server.

There are probably a few ways of doing this; I went with setting it up so that heat would only get run for debug builds.  Since our build server only does release builds, this would work for me.  So I moved the prebuild steps out of the project property settings and implemented them as individual MSBuild tasks.

The first part of doing that was to install the MSBuild Extension Pack from CodePlex.  I did that to get a RoboCopy task for MSBuild.  Robocopy is a very powerful tool for copying and synching up files, but it has one little quirk: it returns 1 as a success code.  Everything else on Planet DOS returns 0 for success and non-zero values to indicate an error.  The MSBuild.ExtensionPack.FileSystem.RoboCopy task knows about that quirk and prevents MSBuild from reporting a robocopy success code as an error.  There is a lot of good stuff in the Extension Pack; you'll want it in your toolbelt.

When you install WiX, you get WiX specific extensions for MSBuild.  The task for heat is called HeatDirectory.  The HeatDirectory equivalent of the heat.exe command line that I was using looks like this:
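Something like this, inside a target in the .wixproj file; the attribute values are illustrative:

<HeatDirectory Condition=" '$(Configuration)|$(Platform)' == 'Debug|x86' "
               Directory="$(ProjectDir)files"
               OutputFile="$(ProjectDir)adminfiles.wxs"
               ComponentGroupName="ProductFiles"
               DirectoryRefId="INSTALLFOLDER"
               PreprocessorVariable="var.SourcePath"
               AutogenerateGuids="true"
               SuppressCom="true"
               SuppressRegistry="true"
               SuppressFragments="true"
               SuppressRootDirectory="true"
               ToolPath="$(WixToolPath)" />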

The first attribute is Condition, which comes with MSBuild.  By setting the value to " '$(Configuration)|$(Platform)' == 'Debug|x86' ", MSBuild will only execute that task when the condition evaluates to true.

That worked perfectly, but only the first time.  After doing one debug build, the next build bombed out on the RoboCopy task.  There was a problem with the files being in use.  If I restarted VS, I could do another build.  If I commented out the HeatDirectory task, the build would work.  I went to the WiX site and sure enough, this was a known bug: heat.exe was keeping the file handles open for the files that it read.

By default, HeatDirectory was running heat.exe from within the Visual Studio process.  This was the fast way to execute heat, but you pick up any handle leaks from heat.exe.  In one of the comments on the bug report, a workaround was suggested: add RunAsSeparateProcess="true" to HeatDirectory.  This forces heat.exe to be run as a separate process, and the leaked handles get flushed when that process ends.
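With that workaround, the task above just gains one more attribute (everything else stays the same):

<HeatDirectory RunAsSeparateProcess="true" ... />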

That took care of the problem.  While this is a known bug, the comments associated with that bug made it clear that it's not going to get addressed any time soon.

So what is CTT?  It is a command line version of the XDT transform that Visual Studio uses when it transforms web.config using web.release.config and web.debug.config.  It's another good tool.

If you are still reading this, here is the final version of the prebuild events:
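This is a sketch rather than the exact project file, but the shape of the final target in the .wixproj is something like this (the RoboCopy task needs the Extension Pack's .tasks file imported into the project, and the names and paths are illustrative):

<Target Name="BeforeBuild">
   <!-- Copy the deployable files; the Options values are illustrative -->
   <MSBuild.ExtensionPack.FileSystem.RoboCopy Source="$(SolutionDir)MyWpfApp\bin\$(Configuration)" Destination="$(ProjectDir)files" Files="*.*" Options="/MIR /XF *.pdb" />
   <MSBuild.ExtensionPack.FileSystem.RoboCopy Source="$(SolutionDir)MyWebApp" Destination="$(ProjectDir)files\web" Files="*.*" Options="/MIR /XD obj" />

   <!-- Clean up web.config with the Config Transformation Tool -->
   <Exec Command="ctt.exe source:&quot;$(ProjectDir)files\web\web.config&quot; transform:&quot;$(ProjectDir)web.deploy.config&quot; destination:&quot;$(ProjectDir)files\web\web.config&quot;" />

   <!-- Only harvest on debug builds, and keep heat out of the VS process -->
   <HeatDirectory Condition=" '$(Configuration)|$(Platform)' == 'Debug|x86' "
                  Directory="$(ProjectDir)files"
                  OutputFile="$(ProjectDir)adminfiles.wxs"
                  ComponentGroupName="ProductFiles"
                  DirectoryRefId="INSTALLFOLDER"
                  PreprocessorVariable="var.SourcePath"
                  AutogenerateGuids="true"
                  SuppressCom="true"
                  SuppressRegistry="true"
                  SuppressFragments="true"
                  SuppressRootDirectory="true"
                  ToolPath="$(WixToolPath)"
                  RunAsSeparateProcess="true" />
</Target>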

Monday, October 07, 2013

Taking the Acer W3 from Preview to RTM

It was time to install Windows 8.1 on my Acer Iconia W3-810. I received the tablet at the Build conference. The W3 came with the 32-bit edition of Windows 8, and Microsoft provided a USB drive with preview copies of Windows 8.1 for it and the Surface Pro. I had installed the preview version of Windows 8.1. With the availability of Windows 8.1 RTM on MSDN, I decided to repave the Acer with a fresh 8.1 install.

One of the nice touches with the Build giveaways was that Microsoft included a 16GB USB3 flash drive. You don't get any faster performance from the USB3 drive on the Acer, but the Surface Pro has a USB3 port.

The first thing I did was download the ISO image. I needed the 32-bit version; the Atom Z2760 processor in the Acer is a 32-bit CPU. The next step was to get the latest drivers for the Acer. I headed over to the Drivers and Manuals page on the Acer site and searched for "W3-810". This brought up all of the files available for the W3.

There is a BIOS update and a Driver Package download. I grabbed them both. I downloaded the BIOS update to the desktop on the Acer and ran it. It rebooted and installed itself without any issue. I didn't see any release notes with the BIOS update, but the tablet was fine after installing it. The BIOS upgrade may have been moot; when I installed the Driver Package later on, it did a BIOS update with a newer version.

I chose to repave the W3 and do a clean install. Windows 8.1 does not come with the touch screen drivers for the W3. That's why I downloaded them ahead of time. If you do this, get a USB hub and a spare USB mouse and keyboard; this will make life much easier for the upgrade.

There is a hidden folder on the c: partition named c:\oem; make a backup copy of it just in case something goes wrong. It has a bunch of Acer specific files and it should have a copy of the Office 2013 installer.

One more file I needed to get was the Windows 7 USB/DVD Download Tool. Ignore the "Windows 7" part of the title, it works just fine with Windows 8. This tool makes it easy to make a bootable USB drive from an ISO file.

When you download/install/run this tool, it will prompt you for the ISO file and then ask if you want to use a USB drive or a DVD. Pick the USB drive option and let it reformat the drive. It will then make the drive bootable and extract the ISO image to the drive. I used this to make the Build drive a bootable Win 8.1 boot disk. It turned out that I didn't need to make the drive bootable, but it's good to know how to do it.

With the drive now a bootable bucket of 32-bit Windows 8.1 joy, I copied the contents of the driver package ZIP that I had downloaded from the Acer site to the thumbdrive. I had what I needed to upgrade the Acer. I went into the Acer BIOS and changed the boot order so that it would boot from an external drive first. To get into the BIOS screen, reboot the Acer and press the F2 key on the USB keyboard. You can also get into the BIOS by holding the volume up button while pressing the power button. But since you'll need the USB keyboard, you might as well start off with it attached.

There is probably more than one way to do this, and I hope an easier one. To get the Acer to install from the USB drive, I had to boot into recovery mode. Press and hold the power button and the Windows button as it boots up. That should boot into the Windows 8.1 Troubleshooter. It will ask you for your language. On the next screen, select the "Troubleshoot" option. On the Troubleshoot screen, select "Advanced Options". I then selected "Command Prompt".

This opened up a cmd.exe shell. From here, I switched to the D: drive and ran setup.exe. This started the Windows 8.1 install. From that point on, it was just like installing the OS as a new install on any other machine. When it prompts you for a destination partition, choose the largest one; it should be about 48 GB in size on this model.

After a long while, the Acer booted up into Windows 8.1 RTM. The touch drivers still needed to be installed. I went to the Windows Desktop and used Explorer to access the thumbdrive. This is where it is handy to have a keyboard and mouse connected. Then I opened the folder with the Acer drivers and ran the setup.exe. If you do this, you will see a banner for the Intel Atom Processor drivers. Click through it and go have a sparkling beverage, it's going to take a while. After it was done, it prompted to restart the tablet, and on the reboot it flashed the BIOS.

After rebooting, I had touch, but I had visual glitches in about half of the Windows Store apps. The text had the same color background as the foreground, making the apps unusable. Fortunately, desktop mode was fine. I was able to run Windows Update and that seems to have sorted it all out.

I do need to re-install the Home & Student Edition of Office 2013 that came with the tablet. Installing 8.1 this way removes all of the installed apps. I wanted to start with a clean slate, so I knew this going in. The installer is supposed to be in the hidden c:\oem folder, but I couldn't find it on mine. I still have the registration key, so I'll either find another source for the installer or just install Office Pro.

Now that everything is in place, my last step was to remove the remaining bits of the previous version of Windows. The installer had renamed c:\windows to c:\windows.old, and that was still there. Microsoft has a page that explains how to do this.

  1. Open Disk Cleanup by swiping in from the right edge of the screen, tapping Search (or if you're using a mouse, pointing to the upper-right corner of the screen, moving the mouse pointer down, and then clicking Search), entering Disk Cleanup in the search box, tapping or clicking Settings, and then tapping or clicking Free up disk space by deleting unnecessary files.
  2. If you're prompted to choose a drive, select the drive that you just installed Windows on, and then click OK.
  3. In the Disk Cleanup dialog box, on the Disk Cleanup tab, click Clean up system files.
  4. If you're again prompted to choose a drive, select the drive you just installed Windows on, and then click OK.
  5. Select the Previous Windows installation(s) check box, and any other check boxes for the files you want to delete, and then click OK.
  6. In the message that appears, click Delete files.

And that's how I did a clean install of Windows 8.1. If you have Windows 8, you shouldn't need to go through any of this. The update to 8.1 will be in the Windows Store and will be a free download.

Wednesday, September 25, 2013

Today's lesson in voice mail etiquette

I just had a fun adventure in returning a wrong number voice mail. I arrived at work late today and I had a voice mail message time stamped at 10:00am. The message was the following:
This is Steve F____, number 1163. I need to know what time to come in for the ____ meeting. Please call me right back at XXX-XXXX
I have no idea who Steve is, and we don't number our clients; we go by their school district name. We also don't have any ____ meetings.

However, Tyler is a big enough company to have another Chris Miller. That Chris works for another division a few time zones away. He was hired a few months back and we get emails and meeting invites for each other all the time. I make sure that any communication meant for the other Chris Miller is sent to the intended destination.

So I went under the assumption that this call may have been for the other Chris Miller. Steve didn't say who he worked for, or where he was calling from, or even the area code. Our voice mail system lets you dial the caller back, so I did that.

When Steve answered, the conversation went more or less like this:
Steve: Hello?
Me: Hi, is this Steve F____?  This is Chris Miller from Tyler Technologies, and I am returning your call from 10:00 this morning
Steve: Who is this?  I didn't call you
Me: You left a voice mail asking me to call XXX-XXXX
Steve: I called someone else at ZZZ-ZZZZ
Me: That is my number (and I repeated his message back to him)
Steve: I was calling ____ at the ____ company, they must have given me the wrong number.
Then we ended the call. If you call my number and get dumped to voice mail, you get the following message:
Hi, you have reached Chris Miller at Tyler Technologies. I'm sorry I am not able to take your call right now, but if you leave your name and phone number, I'll return your call...
If you are going to leave a message, please listen to the outgoing message to make sure that you have called the right person and/or company. Also, you need to leave your area code. We have customers across the country and in Canada; I do need your area code if you want me to call back.

Friday, August 23, 2013

Increasing the target area of a table's DetailDisclosure button with Xamarin.iOS

Can you make that button bigger? I went slightly under the hood in Xamarin.iOS to make an accessory button target area larger.

I'm finishing up an iPad app for our fall release schedule. One of the comments that came up in testing was that it was too hard for some people to hit the detail disclosure button in the tables. I was asked to make the button larger.

What we are talking about is the standard blue circle with a white ">" in the center.


The target area that responds to finger touches is about 44x44 pixels, centered around the button. We wanted to increase that target area to make it easier to reach. My first pass was to create a new button at a larger size. That worked, but it didn't look like the standard detail disclosure button (actually, it looked hideous, but that's another story). I prefer to use the standard iOS imagery unless I have a good reason not to. If you have spent any amount of time with an iPhone or an iPad, you have seen that button before and you know that touching it will provide you with more details. The more familiar a UI is, the less training the user will need.
On a side note: If you want to create a custom button and want to use imagery that will scale between Retina and non-Retina displays, take a look at PaintCode. I was able to take my vector based image, convert it to code, and create a button that drew the imagery at runtime at the device resolution. I don't often need that tool, but I'm glad I have it.

I decided to take another approach. Instead of changing the image, I would make the target area around the button wider. The button would look the same, but touching more of the white area around it would trigger the button. And it wasn't hard to do.

We are going to look at some Xamarin.iOS C# code. This should also work with Objective-C, just with different syntax.

I started out with a custom table cell:

public class MyCustomCell : UITableViewCell
{
 public UIButton detailDisclosureButton;

 public MyCustomCell (string Action, string Status, NSString identKey) : base (UITableViewCellStyle.Subtitle, identKey)
 {
  detailDisclosureButton = UIButton.FromType (UIButtonType.DetailDisclosure);
  detailDisclosureButton.Frame = new RectangleF (0f, 0f, 120f, 44f);
  detailDisclosureButton.ImageEdgeInsets = new UIEdgeInsets (0f, 120f - 44f, 0f, 0f);

  AccessoryView = detailDisclosureButton;

  UpdateCell (Action, Status);
 }

 public void UpdateCell(string Action, string Status)
 {
  TextLabel.Text = Action;

  DetailTextLabel.Text = Status;
 }
}

The constructor creates a new Subtitle cell. I then add a UIButton of type DetailDisclosure. The button then gets a new Frame that is much wider (120 pixels) than the default size. That gives the user a wider target to hit. Since the default is to left align the image, we need to shift the image inside the frame. For that, we assign a new UIEdgeInsets to the ImageEdgeInsets property to set the margin on the drawing rectangle for the image. By setting the left inset to the new width minus the old width, we align the image to the right side of the frame.

The last thing to do is to set the AccessoryView property of the cell to the new button. That lets the default code draw the button in the right place, and we do not need to know how wide the cell is at runtime. What you lose is the AccessoryButtonTapped event: it's no longer called, so we have to handle that functionality ourselves.

The next step is to define the table. I'm using a UITableViewSource and I override the GetCell method to create the custom cell. In addition to creating the custom cell, I needed to assign a TouchUpInside event to the new button, as the default accessory behavior will no longer handle that button.

public class MyCustomTableSource : UITableViewSource
{
 protected List<MyCustomItem> tableItems = new List<MyCustomItem>();
 protected string cellIdentifier = "myTableCell";

 public override UITableViewCell GetCell (UITableView tableView, NSIndexPath indexPath)
 {
  MyCustomCell cell = (MyCustomCell)(tableView.DequeueReusableCell (cellIdentifier));

  var Action = SomeMethodToReturnTitle(tableItems [indexPath.Row]);
  var Status = SomeMethodToReturnSubTitle(tableItems [indexPath.Row]);

  if (cell == null) {
   // Create our custom cell
   cell = new MyCustomCell(Action, Status, new NSString (cellIdentifier));

   // Set the TouchUpInside event to check the button's tag property to figure out
   // which row triggered the event.  And then do something
   cell.detailDisclosureButton.TouchUpInside += delegate(object sender, EventArgs e) {
    EditAction (tableItems[(sender as UIButton).Tag]);
   };
  }
  else {
   cell.UpdateCell(Action, Status);
  }

  // Set the tag property of the button to the current row.  This has to happen
  // on every pass, not just at creation; a reused cell will be showing a
  // different row than the one it was created for.
  // If you are doing sections and rows, then I would subclass the UIButton and
  // add Section and Row properties to make the code easier to work with
  cell.detailDisclosureButton.Tag = indexPath.Row;

  return cell;
 }

 public void EditAction(MyCustomItem thisAction)
 {
  // Do something here
 }
}

With GetCell, we create or update an instance of our custom cell. On every pass, I assign the current table row to the Tag property of the button; since cells get reused, this can't happen only when the cell is created. Since we are not using the default code for adding a DetailDisclosure button, we have to handle the touch event. When the cell is created, the TouchUpInside event of the custom button gets a new delegate assigned to it. I cast the sender as the UIButton and access the Tag property to get the selected row.

My code only had a single section in the table, so I only needed to track the table row. If I had needed to know the current section, I would have created a UIButton descendant and added Section and Row properties, as sketched below. That would make the code a little cleaner. The end result is that the user gets a button with the default look and feel, but with a larger target area.
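For the curious, a minimal sketch of that subclass approach might look like this. The class name is hypothetical, and borrowing the artwork from a throwaway template button is just one way to keep the standard look:

// Hypothetical UIButton descendant that carries its own position in the table
public class IndexedDetailButton : UIButton
{
 public int Section { get; set; }
 public int Row { get; set; }
}

// Borrow the standard detail disclosure image from a template button,
// since UIButton.FromType can't return our subclass
var template = UIButton.FromType (UIButtonType.DetailDisclosure);
var button = new IndexedDetailButton ();
button.SetImage (template.ImageForState (UIControlState.Normal), UIControlState.Normal);

// In GetCell, set both properties on every pass
button.Section = indexPath.Section;
button.Row = indexPath.Row;

// The TouchUpInside handler can then read them back without any Tag math
button.TouchUpInside += (sender, e) => {
 var b = (IndexedDetailButton)sender;
 EditAction (tableItems [b.Row]);
};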

Monday, July 01, 2013

Getting around 0x80073cfb errors after installing Windows 8.1 Preview

As an attendee of the //build/ conference, I received a Surface Pro and an Acer Iconia W3, along with a USB drive with the Windows 8.1 Preview update for each device. After installing 8.1, the Windows Store on the Surface Pro showed that there were a bunch of apps that needed updates. So I updated them all, and it failed to update five of the apps:
  • Reader
  • Windows Alarms
  • Windows Calculator
  • Windows Reading List
  • Windows Sound Recorder
Under the app name was the message "This app wasn't installed - view details.". When you clicked on the message, a dialog would come up with the following message:

Something happened and this app couldn't be installed. Please try again. Error code: 0x80073cfb
You got two buttons, "Try again" and "Cancel install".

Each one reported the same error code, 0x80073cfb. That error code is defined as ERROR_PACKAGE_ALREADY_EXISTS (see here), and it basically means that the package already on the machine kind of matches the one being installed, but not exactly, and Windows doesn't know how to proceed.

"We had to destroy the village in order to save it."
The simplest way around this is to uninstall the version that is already on the machine and then install it again from the Windows Store. It's annoying, but doesn't take that long. I did it the following way:


  1. Run the Windows Store app and have it try to update the apps
  2. Get the errors and, while on the "Installing apps" screen, press the Win key to get back to the Metro screen
  3. Start typing the app name.  When the app appears, do a long press to get the options menu and select uninstall.
  4. Go back to the Windows Store app and click on the app that you deleted.  You'll see the "Something happened" dialog, click the "Try again" button.
  5. The app should install without any problems now.


I would love to find a way to script this with PowerShell; a rough sketch of where that could start follows below. What would be nice would be if the Windows Store app gave you an uninstall/reinstall option when it encounters 0x80073cfb errors. I'm not the only person seeing this, and it's happening with the Acer, just with a different set of apps. On the Acer, it gets that error with these apps:


  • Windows Alarms
  • Bing Food & Drink
  • Help & Tips
  • Bing Health & Fitness
  • Windows Reading List


There is some overlap, but only with 2 of the apps. With the Acer, I found that if I burned Alarms, Help, and Reading List, the Bing apps installed normally. I also had the same problem with the HP Envy 23 that I bought last December. I had to burn the village to save the village on that machine too.
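If I were to script it, my guess is that it would start with the app package cmdlets, something like this (the package names are illustrative):

# Remove the broken packages for the current user; each app can then
# be reinstalled cleanly from the Windows Store
Get-AppxPackage -Name "*WindowsAlarms*" | Remove-AppxPackage
Get-AppxPackage -Name "*WindowsReadingList*" | Remove-AppxPackage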


The burning village image comes from The Witcher 3 Set, from the Flickr feed of Gameranx.

Friday, May 24, 2013

Certified Xamarin Mobile Developer

I just received the email notification from Xamarin that I successfully passed their Mobile Developer certification exam. I was given a free pass for the exam for attending the Xamarin Evolve Developer conference back in April.

It was a tough exam; it went pretty deep on some of the internals and covered a lot of iOS and Android territory. The passing grade was 50% correct out of 100 questions. I hit 90%, and the exam software showed me the correct answers for what I got wrong. Does it make me a better developer? No, but it shows that I have the knowledge and skill set to create iOS and Android apps using C# with Xamarin technology.

More importantly, I am using the skills that I learned at Evolve while I work on a new app for work.  I really like how Xamarin put together the training materials for the sessions. Instead of the usual slide deck and sample code that you get at a conference training session, Xamarin provided chapters of a training manual that covered each session. Plus the source code. You still get the source code.

Yesterday, I went from zero experience with SQLite to being able to create, populate, and use a SQLite database with the sqlite-net ORM in just a couple of hours. My only reference was the material from the Evolve session on cross platform data handling. It's been hugely productive to be able to use C# and the .NET Framework on the mobile side as well as the server side.
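The sqlite-net pattern is compact enough to show in a few lines. This is a rough sketch, not code from the app; the Session class and its properties are made up:

using System;
using System.IO;
using System.Linq;
using SQLite;

public class Session
{
 [PrimaryKey, AutoIncrement]
 public int Id { get; set; }
 public string Title { get; set; }
}

// Create (or open) the database in the app's Documents folder,
// make the table, add a row, and query it back
var path = Path.Combine (Environment.GetFolderPath (Environment.SpecialFolder.Personal), "sessions.db");
var db = new SQLiteConnection (path);
db.CreateTable<Session> ();
db.Insert (new Session { Title = "Cross Platform Data" });
var crossPlatform = db.Table<Session> ().Where (s => s.Title.StartsWith ("Cross")).ToList ();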
As a side note, the SQLite Database Browser is a great open source tool for checking the contents of a database created in the iPhone/iPad simulator. It's also useful for testing SQL queries before you compile them into your code.
Evolve had two parallel training tracks going on at the same time. I was taking the Fundamentals track to make sure that I had the foundations of Xamarin coding down. Now I can review the Advanced training materials and move up to the next level.

Wednesday, May 15, 2013

There should be a Blue Rhino for Electric Cars

What if the car companies came together and came up with a standard for replaceable electric car batteries?  It takes hours to charge an electric car battery.  That pretty much limits an all electric car to local use for small amounts of time.  It also limits the number of vehicles that can be charged at any time.

I’m starting to see charging stations around here where you can let your car charge up during the day.  But it doesn’t help if you do errands all day and you are not parked in any single location.  At some point, you’ll have to call it a day and head home to your own charging station.

What if you could get around the charging period by using swappable batteries?  You could go to a gas station and they would swap out your depleted battery for one with a full charge.  It would be like the “Blue Rhino” model for swapping gas grill tanks.  That’s where you bring your propane tank to the hardware store, the grocery store, etc., and they give you a different one already filled.  You are in and out in a few minutes and the store can service many more customers at peak times.  The Blue Rhino people come in and swap the tanks with fresh ones.  It’s a good model and works pretty well.

For electric car batteries, this is not a trivial problem to solve.  The biggest issue is that every car has a different layout for its batteries and they are not designed to be swappable.  This would require major design changes from the car companies.  Plus, you would have to have the infrastructure to support a network of batteries.  There would have to be a way to easily remove the existing battery pack and attach the new one.  The filling stations could recharge the batteries overnight when the rates are lower.

If the car companies could come to a standard for the charging port (which avoided the betamaxing of the electric car market), they could come up with a standard battery pack.  You could have a mix of replaceable batteries, plus a fixed set that was optimized for that car.  The swappable set doesn’t need to be the full set of batteries, just one large enough to go 50 miles or so.  That would make it easier for car designers.  They can design a rack that provides easy access via the trunk of the car, but have the rest of the car to place the other batteries.

There should be a smart network for locating battery packs.  When your battery is low, the car could check for the closest filling station or car dealership that had full batteries in stock.  This would take some of the fear out of running out of battery power while you are out and about.  You could have a smart phone app or web page show the closest station, or just add a 3G/4G radio to the car.

Someone should design a smart rack for the battery packs.  This rack could charge all the battery packs, eliminating the need for someone to swap charging cables.  It could also report its status and location to the smart battery network in real time.  When your car told you that fresh batteries were at the Mobil station 2 miles down the road, that information would be current and accurate.  You could also bring fresh smart racks to locations that were running low during a busy day.

This also helps with another issue that electric car owners deal with: rechargeable battery packs have a finite life span. Lithium ion batteries typically last 3 years. By swapping batteries out, you are not stuck with a battery that’s at the end of its life cycle. You could have the government subsidize the cost of replacement batteries or factor it in as part of the cost when you swap the batteries. Either way, you avoid the financial hit at the 3 year mark.

If we could do this, I think we would see more electric cars on the road.

Monday, May 13, 2013

I git it now

Sharing code across OS X and Windows was a bit more challenging than what I had expected. We have our own TFS servers, but Xamarin Studio on the Mac really can't do much with them. XS does support git, so I have been using a local repository on the MacBook to version control the source code.  I needed to have that source backed up in a sane manner.

Local TFS servers do not support git (yet). So to get the git repo into TFS, I need a transfer station of sorts. My other work development box is a Windows 7 machine with access to our TFS server. The fun part is getting the code from OS X to Windows without having to do a bulk copy each time.

The first thing I looked at was GitStack. GitStack is a git server that you can push your local repo up to. I spent a few hours on it, but I could not get the MacBook to push the repo up. I'm sure it works somehow, but my basic ignorance of how git works is probably a factor. I don't want to have to think to use version control. Tools should not get in the way of the development process.

While local TFS doesn't support git, TFS in the cloud does. I went in and created an account. I then created a new project by clicking the "New Team Project +Git" button. This brings up a dialog that lets you create a new team project and specify git as the version control.

By default, TFS uses a Windows Live account for credentials. You can add a secondary set of credentials so that you can pass in a user name/password to authenticate. I found that option non-intuitive to locate the second time around. Follow these steps to create (or edit) a second set of credentials for your account:

  1. Login into your TFS account.
  2. In the upper right corner of the screen, you will see a gear icon. Click that gear to go to the control panel
  3. In the upper right corner of the screen, you will see your name or email address and a drop down chevron. Click the chevron and select "My Profile" when the dialog opens up.
  4. The "User Profile" dialog will appear. You can change your avatar and display name here.
  5. Click on the link labeled "CREDENTIALS". This will switch to the Alternate Credentials tab on the dialog.
  6. Now you can enter the secondary credentials. The user name must be alphanumeric only; you can't use an email address.
  7. Click "Save Changes" to save the new credentials.

Now you can use those new credentials when pushing or pulling changes with git. On the Mac, you store those credentials in the OS X keychain so that you will not be prompted each time. I found that the osxkeychain helper that was installed with the OS X version of git was completely broken. I manually installed a newer copy based on the instructions posted here.
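Once you have a working helper, pointing git at it is a one liner:

git config --global credential.helper osxkeychain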

On the OS X side, I had installed git and it was on the search path. I opened up Terminal and in my source code folder, I used git to clone the TFS repo using the following syntax:

git clone https://myteamname.visualstudio.com/DefaultCollection/_git/myproject

This created the folder for the project with all of the git bindings. From within Xamarin Studio, I was able to perform local commits and that worked just fine. I tried to do a push from within Xamarin Studio, but it failed because I had different local git credentials than I did for TFS. I could not find any way from within Xamarin Studio to specify the git credentials. I've posted a question about that in the Xamarin forums; I'm hoping it's something simple.

But I can push and fetch from the command line, so I just created a bash script file that I run to synch with the remote repo (sketched below). There are some OS X GUI clients for git; Harry Wolff reviewed some of them here. Right now, I'm going to stick to the command line until I grok git. At the end of the day, I have what I wanted: a local git repo on the dev boxes, with a master repo in the cloud.
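The script is nothing fancy; something along these lines, with whatever remote and branch names your repo uses:

#!/bin/bash
# Push local commits up to TFS, then pull down anything new
git push origin master
git fetch origin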

Thursday, May 09, 2013

Getting Hyper-V to work on an HP Envy 23


I finally got Hyper-V working on our HP Envy 23.  For Christmas, I had bought a new PC for our family.  We have a shared PC in our family room that everyone uses for email, browsing, etc.  At the time, I was working on a Windows Phone 8 project and I needed a machine that could handle that development.  My own PC was running Windows 7 and you need Windows 8 or better for Windows Phone development.

Actually, you need better than Windows 8 for effective Windows Phone development.  To run the Windows Phone 8 emulator, you need to have Hyper-V installed, which requires Windows 8 Professional and a machine with the virtualization enabled in the chipset.

We wanted an all-in-one PC.  It’s in a shared family space and an AIO takes up less room and generally looks nicer.  Dell had some interesting models, but you couldn’t get one with Windows 8 Pro, just Windows 8.

With HP, you could get a machine with Windows 8 Pro.  So I ordered an Envy 23 with an i5, 6GB of RAM, and Windows 8 Pro.  It is a nice machine with a good 23” touchscreen.  We went from a huge mess of power cables, USB cables, and assorted wires, down to just a power cable and an Ethernet cable.

HP Envy 23

As a side note, while this machine has decent WiFi built in, I prefer that ancient Ethernet technology.  Our house has so many devices using WiFi that anything networkable that's not mobile goes on Ethernet.  I had a few rooms wired for CAT-5e years ago and I use Powerline adapters where the cables don't reach.

As is typical of a new PC designed for home use, hardware virtualization was not enabled out of the box.  I had to go into the BIOS screen, and after a bit of searching, I found the virtualization setting under "Security".  I don't know why they put it there, but that's where it was.  So I turned it on and booted up into Windows.  Since Hyper-V is not typically installed on a new machine, I had to install it.  That was pretty easy to do, and it took less time than trying to find the virtualization setting in the BIOS.
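For reference, you can also turn the whole feature on from an elevated command prompt instead of going through the "Turn Windows features on or off" dialog; this one liner should be equivalent:

DISM /Online /Enable-Feature /All /FeatureName:Microsoft-Hyper-V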

I rebooted the PC and it hung on the loading Windows screen.  I turned off the virtualization setting and it rebooted just fine.  I tried uninstalling and reinstalling Hyper-V; that didn't fix it.  After trying multiple combinations, it was obvious that virtualization and Hyper-V didn't work on this machine.  I called HP support and they said that Hyper-V was supported on this hardware and that either I had installed something that conflicted with Hyper-V or I had a hardware fault.

I didn't agree with either assessment, but I had to follow along with HP's support.  When I bought the machine, I had bought 3 years of priority support.  I usually don't bother with extended support, but it was the cheapest way to buy this machine.  They shipped out a new machine and a week later it arrived.  Fired up the new machine and enabled Hyper-V and virtualization.  Same problem.  That both ruled out a machine specific hardware fault and ruled out the installation of another app being the root cause.

At this point, I just wrote off the problem and sent back the new machine.  Other than the Hyper-V problem, the rest of my family was very happy with the Envy 23.  I ended up building a new machine from scratch that happily runs Hyper-V, so I no longer had a pressing need for Hyper-V on this machine.

But not being able to run Hyper-V on a machine with a CPU and OS that clearly support virtualization bugged me.  Last night, I did a quick search on "Envy23" and "Hyper-V" and saw a few hits.  I was not the only one with this problem.  The first match was "Enabling Hyper-V and restarting results in a hung system...".  It was posted by another developer with a similar Envy 23 machine who was seeing the same problem.

Someone had responded to that message that he had fixed the same problem on his HP laptop by updating the Bluetooth adapter's driver.  He had included a link to another message thread in an HP forum with details about the version and where to get the file.

The problem seemed to be with the Ralink Bluetooth 4.0 Adapter.  Various people had version 9.2.10.6 of the driver installed.  When they updated to version 9.2.10.10, the problem went away.  That sounded like an option worth pursuing.

So I decided to roll the polyhedral dice and try installing that driver.  Normally, I'm not a fan of installing hardware drivers unless I know that they are specifically for the hardware that I own.  But this is a commodity part and most likely uses a driver for a family of related parts.  Also, I back up my machines to a Windows Home Server box.  The worst thing that happens is that I hose the machine and have to do a bare metal restore.

I downloaded the driver and started installing it.  As part of the install, it uninstalled the previous version, and I had an "uh oh" moment.  My mouse and keyboard connect over Bluetooth; updating the driver could affect them.  Fortunately, they worked through the process.  After installing the update, I rebooted the PC and everything seemed to work just fine.  The mouse and keyboard still did mouse and keyboardy things, so I knew that Bluetooth was still operational.

I rebooted one more time and enabled virtualization in the BIOS and booted up the machine.  This was the moment of truth.  The PC booted up normally and I was able to verify that Hyper-V was installed and functioning normally.

So, that brings us back to why it failed with the original driver installed.  I don't know why a Bluetooth driver would hose the operating system when Hyper-V was enabled.  That is so random, it's not something that I would have considered as the root cause.  From reading the messages on the HP forum, it looks like someone had reinstalled the OS and needed to download the Bluetooth driver.  When they installed the latest version available, they were able to boot with Hyper-V and made the connection that the Ralink driver was the root cause.

Monday, May 06, 2013

My journey into the Center of Gravity

Last Friday I was given a tour of the Center of Gravity (COG).  What exactly (and where exactly) is the Center of Gravity?  Its full name is the Tech Valley Center of Gravity and it is a community of technical and artistic creators, makers if you will.  They have a permanent makerspace in downtown Troy, NY, and their grand opening is today.

I was lucky enough to get a tour on Friday from one of the directors of the COG board, Laban Coblentz.  While their location is new, they have managed to collect a fair amount of equipment already.  There is a lot of equipment that can be used now, from old time drill presses to laser cutters, from soldering stations to a bio lab.  They are off to a good start.


You can sign up for free as an associate member; that will get you a membership card and the opportunity to purchase day passes to access the equipment in their makerspace.  If you plan on doing a lot of building and tinkering, then you’ll want to step up to a full membership ($60/month) or “Super User” ($100/month).  The paid memberships give you full access to the makerspace and include safety training.

If you need to fabricate some one off parts for a project, they have a couple of 3D printers.  In a couple of years, you’ll be able to buy one at Walmart, but right now it’s hard to get access to one around here.  When I did my robotics project last fall, I could have used a 3D printer to make the mounting pieces.

For additional pictures of the COG, the All Over Albany blog posted a large set on their site.

Tuesday, April 23, 2013

Notes on attending Xamarin Evolve 2013

Last week I attended the Xamarin Evolve 2013 conference in Austin. It was, hands down, the best conference that I have ever attended. It was divided up into two days of training, plus two more days of conference sessions. I attended all four days and the training days alone were worth the price of admission.

I’ve been using Xamarin.iOS for about 5 weeks now. I’m working on an iPad prototype for a companion app to one of our existing desktop apps. I’ve been amazed at how well the Xamarin tools work and I’ve been able to get a nice app up and running. I had just enough exposure to Xamarin to be able to appreciate the training that I received.

They broke the training into two tracks, Fundamentals and Advanced. I had enough entry level experience with Xamarin.iOS that I wanted to mix and match the sessions, but the training rooms filled up to capacity, so I stayed in the Fundamentals track. The session instructors were top notch and they had Xamarin TAs floating around the room to keep everyone on track. For the code examples, there was less typing of code and more uncommenting of blocks, but that was OK. On the track that I was on, this was new ground for many people, and half of the experience was learning how the tools worked and how iOS design patterns worked.

One smart thing that Xamarin did was to send out the course material a week before the conference. The sessions made extensive use of the sample projects and it saved a lot of time to have them preinstalled. Instead of including the slide deck from each session, they included what looked like chapters from a training manual. That is much more useful than a slide deck.

During the conference keynote, Xamarin CEO (and co-founder) Nat Friedman had two surprises for us. The first one was a Xamarin native iOS designer. This will free us from the horror of Interface Builder. It only works with storyboards, but has the ability to work with custom components and saves us from the general head scratching weirdness of Interface Builder.

The other surprise was big. Xamarin Test Cloud is a cloud based testing platform that lets you test your apps on hundreds of mobile devices. With 1500+ devices, the Android platform has become increasingly fragmented. You have to deal with multiple carriers, multiple screen resolutions, and multiple versions of Android. Test Cloud will let you upload your app to their site and select which mobile devices to test it on. 

It has a UI scripting language so that you can test your app the way a user would use it. You will get a report of which devices passed and which ones failed. You get screen shots of each step, which allows you to visually verify that the correct results show up on the device.

In addition to testing UI interactions, you get machine profiling as well. You can see memory usage, CPU usage, and response time. This is a game changer; no one else has this on the market. If I were still doing Android coding on the native Java tools, I would switch to Xamarin just for the Test Cloud usage. No one knew how much this would cost, but SourceGear’s Eric Sink summed it up best (and I am paraphrasing): “It’s going to be cheaper than what I would pay a QA guy to do all of that manually.”

The hardest part of the conference days was picking which session to attend. I’m a big fan of Jon Dick’s ZXing.Net.Mobile scanning library, but his session was scheduled at the same time as Nic Wise’s MonoTouch.Dialog session. I needed help with MT.D more, so it was off to that session. Xamarin filmed all of the sessions, so at some point I’ll be able to see the sessions I missed.

Another great session was Stuart Lodge’s session on MvvmCross. The MvvmCross library provides XAML-like binding on Android and iOS, and allows you to create cross platform apps using the MVVM design pattern. If you are supporting iOS, Android, and Windows 8/Windows Phone, you really want to look at this library.

Wally McClure did a session on mapping that I liked.  I like the way he does his presentations.  He went around meeting the session attendees before he started; that was a nice personal touch.

I did miss the comedy show on Tuesday; Xamarin had set up a mini-hackathon and I lost track of the time while doing that.

Tuesday night, Xamarin rented the park across the street from the hotel and had a series of food trucks providing the best that Austin had to offer. The giant sized Jenga was pretty popular.

I got to meet and talk with Miguel de Icaza.  He was funny and very bright.  He asked what I was working on and what I thought of the training.

Wednesday had a great UI session by Josh Clark.  Called “Buttons are a Hack”, it was about how to use touch interfaces to design beautiful apps that anyone could use by just picking up the device and exploring.  Any session that starts out with a clip from “This is Spinal Tap” is a winner.

One of the cooler things that Xamarin did was to provide 30 minute sessions with one of their engineers.  That time was available for questions and code review.  I set up my session a week before the conference, and one of their support reps had me send in some screenshots and a list of questions.

When it was time for my session, I was able to get all of my questions answered and I was able to verify that I was using some of the iOS internal objects correctly.  I was pretty sure I had it right, but it was nice to have the code validated.

Since we were in Austin, a bunch of us went out one evening to see the bats at the Congress Avenue bridge.  About a million bats nest under this bridge and they all leave at dusk to go feeding.  It was incredible to watch them all fly out.  There were so many that, from a distance, they looked like a cloud.

Evolve 2013 was a great experience.  I learned a lot and made some new friends.  I’m looking forward to Evolve 2014.

postscript:
More Congress Ave Bat pictures can be found here.

Sunday, March 17, 2013

Why was the microphone button on the iPad virtual keyboard disabled?

After demoing some voice dictation with an iPad app that I am working on, the microphone button stopped working.  It was fine for one demo, disabled for the next one.  I could not figure out what had changed.  I kept seeing numerous references to setting the dictation option and making sure that the iPad had a network connection.  Stuff like this forum posting on discussions.apple.com:

It should be in Settings > General > Keyboard > Dictation, and will only then appear on the keyboard (to the left of the spacebar) if you are onlines - is that where you are looking in Settings ?

I had a good WiFi connection and I didn’t have a Settings > General > Keyboard > Dictation option.  I tried rebooting the iPad, no change.  The strange thing was that it was working at 2pm and not at 5pm, and no changes had been made to the iPad.  We have a pool of iPads that we sign out for testing; this one is the 3rd generation iPad and I have removed all of the apps that other people had installed.  The iPad has iOS 6.1.2 installed.

I saw the Bluetooth icon on the status bar.  It was not lit up, so it wasn’t connected to any devices.  So I went into the Bluetooth settings and saw this:

The Bluetooth settings, showing a paired ZAGG Keyboard

I don’t have a ZAGG Keyboard, but someone else must have paired this iPad to one.  In between the demos, I had borrowed another iPad that was in a ZAGG keyboard case.  It must have been in close enough proximity to the iPad I was using that mine picked up the other keyboard.  I turned off Bluetooth and the microphone button was back on the iPad keyboard.

That makes no sense.  The iPad was not connected to the ZAGG Keyboard, but it still disabled the microphone button on the onscreen keyboard.  I could not reproduce this with another iPad running iOS 5 and a Logitech keyboard.  The microphone button stayed enabled with that keyboard on and off.  One thing comes to mind: Apple and the keyboard manufacturers need to come up with a keyboard button definition for the microphone.  If you are using an external keyboard, it’s annoying to have to bring up the onscreen keyboard to use voice dictation.

Sunday, February 10, 2013

Today I learned that my daughter has never seen a 3.5” disk

I was watching my 10 year old daughter work on a story in MS Word 2007. To save the document, she was going into the Word menu to select save. I asked her why she didn’t just click the save button. She said “What save button?”. I pointed out the save button that is just above the Home tab.

Word 2007 Save

She thought that was the print button. She had no idea that the save button glyph represented a 3.5” disk. She’s never seen one. I have some around, but it’s been years since I have actually used one of those guys.

I dug out an old 3.5” disk and gave it to my daughter. She had no idea what it was.  And she was not impressed when I explained to her what it was.

blue floppy

The 3.5” disk had a good run, but it’s been 15 years since Apple came out with the first iMac without a floppy drive. It took a few years for the rest of the industry to catch up, but now the 3.5” disk is deader than a dot matrix printer. Which is another piece of computer technology that my daughter has never seen. 

For the last 20+ years, it has been commonplace to use a disk icon for the save button on toolbars and menus. I don’t think twice about it being obsolete technology; I just “know” that a disk image on a button means save. But a 10 year old doesn’t have that frame of reference to identify the function.  From her viewpoint, the closest match is a printer; printer icons can look a bit like the disk icon.

The problem is that we really don’t have anything now to replace that image. What other object represents saving a file, and can be drawn in a 16x16 matrix? We’ll be carrying around that image for years after the last disk falls apart.

We carry around other baggage like that. It’s like dialing up someone on your cell phone. I can’t remember the last time I used a phone that had a dial on it. But the terminology is so well established that we just associate dialing with the act of placing a call.  It’s time to come up with some new visual metaphors.

Wednesday, January 16, 2013

A quick photobook order on Shutterfly

A $20 coupon code from Shutterfly showed up in my inbox a week or so ago.  I could spend it any way I wanted on Shutterfly; that was pretty decent of them.  And of course I kept putting off using it.  Yesterday, I got another email telling me that I had one day left to use it.
So I decided to do a picture book.  I went into Picasa, selected about 150 or so pictures from the last 12 months, and uploaded them to my Shutterfly account.  And then I procrastinated until 10pm tonight.  It took about 90 minutes, but I was able to make a family "year in review" book for 2012.  After placing the order, they generated the code to display the book as a widget on this blog.


I like using Shutterfly.  It's one of those services that just plain works, and I can upload my photos directly from Picasa without having to think about it.