Tuesday, February 1, 2011

JavaScript frameworks explosion

I can't believe how many JS frameworks are appearing, and how fast. At the beginning of December I was aware of jQuery, Scriptaculous/Prototype, Dojo, Ext and YUI for mainstream web development. jQTouch looked the go for iPhone. SproutCore and Cappuccino looked a bit esoteric and seemed to be aimed at full-blown applications in a browser window.

I became aware of the Unify framework in November and was intending to look at it closely during my extended leave over December and January. I was also interested in uki.js as an alternative to jQuery.

But it seemed that almost every week in that period I found yet another framework being touted as the best thing since sliced bread.

First I found CoffeeScript, which is actually an alternative, simplified syntax for JavaScript together with a compiler that outputs pure JS. It makes JS really easy to write.
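For anyone who hasn't seen it, here's a rough idea of the mapping. This is my own toy example, and the JS is a hand-written approximation of what the compiler emits, not actual compiler output:

```javascript
// The CoffeeScript source
//   square = (x) -> x * x
//   console.log square 7
// compiles to roughly the following plain JS
// (hand-written approximation, not actual compiler output):
var square;

square = function(x) {
  return x * x;
};

console.log(square(7)); // 49
```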

Then as a result of a question on Posterous a whole swag of alternatives were mentioned:
  • Backbone.js (which in turn uses underscore.js) + jQuery or Zepto
  • JavaScriptMVC
  • Knockout.js
  • Angular.js
Each of these warrants a decent looksee. Knockout seems to have a lot of fans. Angular seems incredibly simple to me.
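I haven't dug into any of these yet, but the core idea Knockout is built around (observable values that notify subscribers when they change) can be sketched in a few lines of plain JS. This is my own toy illustration of the pattern, not Knockout's actual API:

```javascript
// Minimal observable in the spirit of Knockout's ko.observable.
// A toy illustration of the pattern only, NOT the real Knockout API.
function observable(initial) {
  var value = initial;
  var subscribers = [];
  function obs(newValue) {
    if (arguments.length === 0) return value;   // read:  obs()
    value = newValue;                           // write: obs(v)
    subscribers.forEach(function(fn) { fn(value); });
  }
  obs.subscribe = function(fn) { subscribers.push(fn); };
  return obs;
}

// Usage: a "view" that reacts whenever the model changes.
var channel = observable(2);
channel.subscribe(function(v) { console.log("Now on channel " + v); });
channel(7); // logs "Now on channel 7"
```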

Of course there's a few mobile frameworks that need consideration:
  • Sencha Touch (successor to jQTouch)
  • jQuery Mobile
  • Zepto.js
  • xui.js
  • ChocolateChip Mobile
I also saw mention of DHTMLX but have no comment at this stage.

So there appears to be a spectrum of frameworks. At one end are the "app in a web page" frameworks like SproutCore and Cappuccino, which seem to require large downloads at commencement but then respond almost exactly like a stand-alone app. (A big footprint is not automatically required: uki.js's Mail and Wave demos are very small packages.) In the middle are "a few pages" apps, which seems to be the place for Backbone, Knockout and Angular. And then there is the minimalist and/or mobile device end, where the framework simply ignores all the baggage web apps have to carry (such as IE6 compatibility) and takes full advantage of the mobile device browser's handling of HTML5 and CSS3.

The mobile device end of the spectrum is where to aim I think. Eventually all browsers will support HTML5 correctly (maybe even MS will get it right; maybe...). In the meantime there is a huge and ready market for mobile device web apps.

But how does one keep up with this deluge of frameworks? How does one decide which, if any, are a) good quality; b) easy to use; and c) likely to be around long enough to give some ROI on the effort spent learning them? In my years as a developer I've seen so many IT technologies come and go and, perhaps most annoyingly, the technically best one is rarely the winner.

RedEye special

I tried everything I knew to get the RedEye mini to work on my iPhone 3GS but it simply wouldn't. Eventually I complained to RedEye and they asked if I could try it on another iPhone. My son's 3GS worked perfectly. It seems I have a faulty mike connection in the headphone socket. Too late to get a warranty repair, dammit. I'd never noticed the problem because I was always using the Bluetooth headphones and mike.

Oh well, maybe my son will find the REmini useful...

Saturday, January 15, 2011

Another IR dongle for iPhone

Found the RedEye Mini dongle for iPhone. Advertised at US$49 on their website. Tried to buy it but they don't ship to Oz. Seems they've given distributor rights to some tin-pot company in Oz and they want AUD$95!!, with the exchange rate almost at parity.

So I checked Amazon and found one shop selling it for US$38 but they wanted $35 for postage! Eventually found Amazon's offer which was the same price but only $9 for postage. So it looks like I will get the dongle for the equivalent of US$49 and the Oz sellers can go complain to the govt. about how they are losing sales to the Internet because no GST makes them cheaper (yeah, almost 100% cheaper).

Not sure if this dongle will be as open to program as the L5 but they might make an exception for the VTV. The use of the audio socket...

Oh blast, the use of the audio socket means it will probably disable the microphones in the iPhone. Screwed whichever way I turn: use L5 and lose charging port or use RedEye and lose voice input. Well at least the RedEye will make a good manual TV controller.

VTV

I might have the order of the apps around the wrong way here. Maybe I ought to be hacking the OpenEars sample so that, in addition to displaying the words it recognises, it also outputs the corresponding IR codes to the IR dongle. That's actually a lot easier to implement (I think). No need to install the IR code learning section nor the GUI section.

So devel steps:
  1. Test VTV app with VTV vocab (clone OE sample app). (Especially test headset input.)
  2. Use the L5 app to learn the hex codes for DTV controller and upload them to L5 hexcodes database.
  3. Install the hex codes in VTV app.
  4. Add a "dumb" controller which toggles "on" and "off" each time voice input is detected. Can test this with my DTV tuner controller (or maybe even the Apple FrontRow controller?).
  5. Add IR output for all the vocab.
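Step 4's "dumb" controller is really just a toggle plus a send. Here's a quick sketch of the logic in JS for clarity; the real glue will be Objective-C against the OpenEars and L5 APIs, and every name below is made up:

```javascript
// Toy sketch of the step-4 "dumb" controller: every recognised word,
// whatever it is, toggles power. All names here are hypothetical,
// not the OpenEars or L5 API.
function makeDumbController(sendIR) {
  var powerOn = false;
  return function onWordRecognised(word) {
    powerOn = !powerOn;
    // The same IR code toggles the set on and off on most TVs.
    sendIR("POWER");
    return powerOn;
  };
}

// Fake IR sender so this can be tested at the desk, no dongle attached.
var sent = [];
var controller = makeDumbController(function(code) { sent.push(code); });
controller("volume"); // any word at all -> power toggles on
controller("mute");   // -> power toggles off again
console.log(sent);    // [ 'POWER', 'POWER' ]
```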

VoiceTV

A little bit later...

Have finally got around to working a bit on the project. Have downloaded, compiled and run the L5 sample app on both the emulator and my iPhone. Have downloaded, compiled and run the OpenEars library and sample app on both the emulator and my iPhone. OpenEars is an Objective-C wrapper around CMU Sphinx.

It looks like the restricted vocab for the controller will allow Sphinx to recognise words pretty accurately. However, I still need to generate and test a "TV controller" word list to replace the sample app's "mobile toy controller" list.
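For the record, the vocab I have in mind is tiny. CMU's online lmtool takes a plain-text corpus, one utterance per line, and generates the language model (.lm) and dictionary (.dic) from it. My guess at a first cut of the corpus, nothing more:

```
POWER
CHANNEL UP
CHANNEL DOWN
VOLUME UP
VOLUME DOWN
MUTE
```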

The L5 now uses the Philips Pronto hex code format ("We are using a modified subset of the Pronto IR Format (http://www.remotecentral.com/features/irdisp1.htm)."). This would be fine except the RemoteCentral library is seriously out of date wrt. Sharp Aquos TVs. It looks like I will have to manually load the codes from a controller.
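The Pronto format itself is simple enough: a string of 4-digit hex words, with a `0000` preamble for raw codes, then a word encoding the carrier frequency, two burst-pair counts, and then the on/off durations in carrier cycles. The carrier frequency falls out of the second word. A small sketch of that decode (my own code; the 0.241246 constant is the Pronto internal clock period in microseconds):

```javascript
// Decode the carrier frequency from a Pronto hex code string.
// Word 2 (index 1) holds the carrier period in units of the Pronto
// internal clock, 0.241246 us per count.
function prontoCarrierHz(pronto) {
  var words = pronto.trim().split(/\s+/);
  if (words[0] !== "0000") throw new Error("only raw (0000) codes handled");
  var divisor = parseInt(words[1], 16);
  return 1000000 / (divisor * 0.241246);
}

// Typical consumer-IR preamble: 0x006D gives a ~38 kHz carrier.
console.log(Math.round(prontoCarrierHz("0000 006D 0022 0002"))); // 38029
```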

So all I have to do now is find a way to fool the L5 sample app into thinking it's getting input from its buttons when in fact it's really getting its input from OpenEars-recognised words.

Saturday, August 7, 2010

iPhone app for voice-controlled TV

I was looking to do something useful during my extended sabbatical "between jobs" last year, so I contacted ability.org.au to see if I could assist with any projects. Their director, Graeme, suggested an iPhone app to allow voice control of a TV. He noted the relatively low cost of iPhones (compared to typical assistive devices), their ubiquity, the availability of IR dongles for iPhones (nearly all mobile phones prior to the iPhone used to include IR capability; how ironic), and the availability of voice-recognition software for the iPhone (specifically Dragon Dictation, Google's voice search and the voice control utility in iOS). So now seemed like a good time to try this. It might even be possible to get some research funding to at least cover costs. (And to be honest, if it works it might be saleable on the App Store for general users, not simply those who can't accurately hit the buttons on a typical TV controller.)

Graeme suggested I look at the L5 Remote dongle for iPhone because they have recently open-sourced the API. I downloaded this package and it does indeed seem usable. The L5 costs US$49.95 (+S&H), about AU$70. Another dongle I looked at is the My TV Remote (originality in naming doesn't seem to be part of the plan). It sells for US$9.99 and plugs into the iPhone audio socket (the L5 plugs into the docking connector). I emailed the company and although they don't sell to Australia yet, they are considering it. And they are willing to send me their API if I want it. The MTR's use of the audio socket might make using the headset mike difficult (I'd probably use Bluetooth to bypass it). OTOH the L5 uses the docking connector, which might make charging and/or long-term use difficult. (I couldn't find a charging double adapter for the dock.) I purchased an L5 and it arrived within a week, but it's just sat on my desk while work and life have intervened :-)

My initial hope was to use the recogniser in Google's voice search but so far haven't been able to find an open-source API for it. Dragon Dictation is proprietary and requires licensing which I'd rather avoid if possible.

Last week I discovered the CMU Sphinx project (http://cmusphinx.sourceforge.net/), an open-source voice recognition project. Brian King has made an iPhone Objective-C wrapper available for the pocketsphinx library (http://github.com/KingOfBrian/VocalKit) so I'm currently trying to learn how to use Sphinx.

The project, as I see it, requires the following:
1) A recent Xcode and iOS4 SDK installed on my MacBook
2) pocketsphinx library added to Xcode's static lib list
3) L5's lib added to Xcode's static lib list
4) Some glue code to output IR codes when one of a small list of command words is recognised.

Each of the above steps is a project in itself:
1a) Renew developer subscription with Apple
1b) Download latest Xcode and iOS4 SDK
1c) Install
1d) Install/update app signature certificate
1e) Write a test app, compile and test on iPhone emulator
1f) Install and run on iPhone

2a) Download Brian King's iPhone wrapper
2b) Install in Xcode as per README
2c) Write and test a "hello world" app
2ca) Do I need a special dictionary or is default dictionary adequate?
2cb) Does Sphinx need training for Australian accent?
2cc) Should I test Sphinx on MacBook first to answer these questions? (Probably yes.)
2d) Modify pocketsphinx output if necessary to ease connect to L5
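On 2ca): pocketsphinx dictionaries are plain text in CMUdict format, one word per line followed by its ARPAbet phonemes, so a custom one for the command vocab should be easy to write by hand. Roughly like the following (phoneme spellings cribbed from memory of CMUdict; double-check before trusting them):

```
POWER    P AW ER
CHANNEL  CH AE N AH L
VOLUME   V AA L Y UW M
UP       AH P
DOWN     D AW N
MUTE     M Y UW T
```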

3a) Download and install L5 Remote app on iPhone
3b) Upload app with test controller's IR codes. (Can use digital tuner remote controller for sampling and testing.)
3c) Verify app works.
3d) Download and install L5 API in Xcode
3e) Write and test "Hello world" app
3ea) Specify what a "Hello world" app should do.
3eb) Write, test, debug on iPhone (can't use the emulator for extra hardware like the L5)

4a) Design control program
4aa) GUI is almost non-existent.
4ab) Functionality to copy existing L5 app controls
4b) Code and test on iPhone
4c) Repeat 4a) and 4b) until working :-)

OK, so today we are up to 1e): write and install a test app on my 3GS with iOS 4.0.1.

More later.

Tuesday, March 3, 2009

My first SMT build

I saw a PIC Logic Tester kit being offered by Jaycar which uses SMDs. I built a TTL/CMOS probe in my previous life as a hardware designer in the '70s but hadn't updated to the new lower-power technologies. I need a logic probe, so this was a good start.

Kit arrived this afternoon and I nearly died when I saw how small the components really are. But under the magnifying lamp I could read their values or markings and the PCB looked a lot clearer. So I fired up my new soldering iron and commenced work.

Six hours later it's finished. Lots of problems encountered and overcome. The worst was when I squeezed too hard on the tweezers and a capacitor shot out and I have no idea where in the room it landed. I spent 15 minutes looking but couldn't find it, so decided to proceed without it, hoping it wasn't too critical. But then I found it under the board itself. 0603 components are the worst for hand assembly.

I also discovered I probably had the iron too cold initially. I started at 350C but moved it to 375C, and the joints formed a lot quicker and 'wetter'.

Now I've realised I don't have a circuit to test it on so my next project will be to use the FreeScale sample chip I got a while ago to build a simple counter or whatever and I can check my new logic probe on that.

I was delighted to run the 'blinktest' demo program on my S40C18 and display a 291Hz square wave from one of the pins on my new oscilloscope.