Sunday, March 25, 2012
"Dunboyne Fun Run" lives up to its name
I was pleased with my time: 37 minutes and 6 seconds for a 4 mile route.
After the race was over there was even a bonus of a magnificent free spread of cakes and tea for all participants in the local community centre. They even had a band playing live traditional music. Definitely a day to be happy I live in Ireland.
Saturday, March 24, 2012
Gathering of amateur and professional weather fans shows Dublin City of Science 2012 at its best
I suppose that all Irish people are interested in collecting weather data. The most common opener for phone conversations in Ireland is "what is the weather like where you are?". Now that professional quality weather monitoring equipment has become available and affordable, we can turn these informal weather observations into proper scientific measurements which can be used to improve forecasting. I met several people who, like me, have their own weather stations in their back gardens. I also found out that there is even a site dedicated to collecting this data from Irish amateur weather monitoring stations. I plan to connect my existing monitoring station to this site within the next few days.
This is an excellent example of the type of event being organised as part of Dublin City of Science 2012. It encouraged my existing interest in meteorology: I promptly joined the Irish Met Society, and I have every intention of being an active member.
Sunday, March 11, 2012
Fingal Library Service add eBooks to their range of offerings
The library is a really great service and, amazingly, it is free. People who complain about the level of taxes we pay should at least ensure that they take advantage of the services that these taxes fund. As well as providing a large selection of books available for loan, they also have a large selection of DVDs, music CDs and audio-books to choose from. They even provide a facility to borrow paintings for a period of up to 3 months (although I have never availed of this service myself).
Some people dismiss the library as an institution from the past which is doomed to fade away in the internet age. I disagree very strongly with this view, and I feel that the concept of a public lending library is an important one that we ought not to forget in the debate about the balance between the rights of creators and consumers of content. It is important that we maintain public lending libraries if we want to ensure that everyone in society has an affordable opportunity to access the educational materials they need. In fact the free internet access in the library can be a vital service to people who might not otherwise have access.
Although I am a big fan of technology, many people are surprised to find that I have not yet embraced eBooks. I thought I would use an Android tablet that I received as a present for reading eBooks. Although I initially read a few eBooks on the device, I found myself in the situation where I could easily gain free access to paper books, but I had to pay substantial costs for the slightly less convenient experience of reading the electronic version of those books.
Therefore I was very excited to learn a few weeks ago that Fingal library had added eBooks to their range of offerings. To access the eBook lending service you go to their eBook portal and sign in with your library card number and PIN (this is the same PIN you use to log into the regular library web page). If you don't have a PIN you can get one by calling in to your local library.
When you start using this service you might initially be a bit confused, because this is not a web site run by the Fingal Library service as such; instead you are accessing a global web site, OverDrive, with the cost of your access covered by Fingal County Council. I am not sure exactly how the financial details work, but luckily I don't need to. I suspect that the books on offer to me are a selection bought by the Fingal library service rather than a global selection, because there seems to be an unusually large number of books with themes of Irish interest. The service is quite easy to use, and any problems I did encounter were easily solved with assistance provided by the operators of the Fingal Library page on Facebook.
The eBooks in the library are available in two different formats. One format is compatible with Adobe Digital Editions which is usable on your computer. The other format is compatible with the OverDrive reader application which is available for a wide variety of mobile platforms. I used the OverDrive Android version on a miScroll tablet and found the reading experience to be very pleasant. I also installed the application on my phone - while the software worked perfectly, I can't imagine I would read an entire novel on the small screen of my phone.
There is a version of the OverDrive application which works on Amazon Kindle devices, but apparently the OverDrive site has placed restrictions which stop books borrowed from libraries outside the USA from being read on the Kindle. I can't see the logic behind this unfair regional restriction, but I guess I would get more worked up about it if I actually owned a Kindle.
When you borrow a book from the library you are free to read it on any compatible device you own, but you can't simply transfer the file from one device to another (which I initially tried to do). Instead you must download the book directly from the web site by selecting the "Get Books" option from within the application. I am not sure why this is necessary, but I assume it is something to do with ensuring you are not trying to bypass the usage limitations.
Each user is allowed to borrow up to three books at a time. When you borrow a book you can choose between a borrowing period of 14 days or 21 days. If you choose the OverDrive format you can return the book as soon as you are finished reading it, but with the Adobe format you can't. This means that you would probably be best to choose the shorter loan period, because the book will count against your loan limit even after you have finished reading it. If one of your borrowed books has expired you can always download it again (unless someone else has borrowed it in the meantime). If you have not deleted the book from your library, your stored bookmarks will be maintained.
What did I borrow?
Decider
by Dick Francis

I picked this first because I expected the content to be undemanding. I was pleasantly surprised by how enjoyable I found the experience of reading the book, both in terms of the physical experience of reading an eBook and the fact that Dick Francis is clearly a very skilled writer. I did not know what the ending was going to be until I reached it, but I knew from experience that it was going to be a happy one.
Breakfast with Anglo
by Simon Kelly

The second book I chose was an account of the recent Irish property bubble as told by a property developer who was personally involved in the centre of the action. It was different from the previous book in that I knew in advance what the ending was going to be and that it was not going to be happy. Nevertheless I found it educational to see how things looked from the point of view of someone who was personally involved. The last chapter was devoted to what lessons he learned from the experience. I know that hindsight is always 20:20 vision, but anyone considering getting involved in property speculation would be well advised to read this chapter.
If You Lived Here, I'd Know Your Name: News from Small-Town Alaska
by Heather Lende

The third book I chose was different again. It was an account of what it is like to live in a small isolated town in Alaska. The author worked as an obituary writer for the local paper, and the stories in the book seemed to be mainly derived from the information she learned while researching these obituaries. Irish papers tend to only publish obituaries for prominent people, but it seems that in Alaska they publish obituaries for all people who die. This meant that the stories described an eclectic selection of people who lived very different lives. It was ironic to be reading a modern format eBook about people leading an austere life with little access to modern technology, but overall I found it enjoyable.
In short, I really like this new eBook service. I don't think I will give up reading paper books yet, but I will definitely supplement my reading materials with regular borrowing from the eBook library.
Friday, March 9, 2012
The trouble with GRUB
This tool is very useful for people who want to try out Linux but want the security of being able to easily switch back to Windows if they regret the decision. Although GRUB doesn't officially understand how to control the Windows boot process, there is a well understood trick to allow you to include Windows versions in your GRUB boot menu. Since Windows can only be booted from the first partition on your hard disk, you simply need to get GRUB to make the partition containing the version of Windows you want to boot look like the first partition, and then GRUB hands control to the Windows boot loader to do the rest. Most Linux installers will automatically configure GRUB for you at install time with a choice of all of the different operating systems found on your partitions, so you don't really need to learn much about how it works under the covers.
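For the curious, a chainloading entry in a GRUB 2 menu typically looks something like the sketch below. The menu title and partition number are assumptions (your installer's os-prober will generate the correct values for your disk); GRUB simply points at the Windows partition and hands over to the boot loader found there.

```shell
# Hypothetical grub.cfg menuentry for chainloading Windows
# (hd0,msdos1 = first MS-DOS partition on the first disk; adjust to taste)
menuentry "Windows" {
    insmod part_msdos     # understand MS-DOS partition tables
    insmod ntfs           # read the NTFS filesystem Windows lives on
    set root=(hd0,msdos1) # the partition holding the Windows boot loader
    chainloader +1        # load and run that partition's boot sector
}
```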
Every time you update your Linux kernel it is necessary to tweak the configuration files to include an entry for the new kernel. Luckily, the most recent version of GRUB (which is confusingly known as GRUB 2 although the current version is v1.99) has a great configuration system which includes scripts to automatically rebuild the configuration files each time you install a new Linux kernel.
With such glowing praise for GRUB, you might wonder why I entitled this post "The trouble with GRUB". However, there is one minor problem with the way that GRUB works which is very annoying. The GRUB 2 scripts for building configuration files assume that the configuration files should be contained in the /boot/grub/ directory on the partition that the current version of Linux was booted from, but at boot time GRUB might look for a menu definition file in a completely different partition. For example, I currently run Ubuntu Oneiric (v11.10) on my laptop, which is installed on partition /dev/sda5, but at boot time GRUB looks for its boot menu on /dev/sda3 (where RHEL 6 is installed). This means that I need to remember to manually copy the GRUB menu definitions from /boot/grub/grub.cfg on /dev/sda5 to /boot/grub/grub.cfg on /dev/sda3, or else I will continue to boot an old version of the kernel.
This must be a nuisance to many Linux users and not just me. Does anyone know of an easy way to tell GRUB which partition should be used for storing its configuration files? Even better, does anyone know how the automated configuration scripts could be updated to figure this out for themselves?
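In the meantime, the manual copy can at least be scripted. Below is a minimal sketch; the mount points and device names are assumptions matching the setup described in this post, so adapt them to your own layout before use.

```shell
# Copy the freshly regenerated grub.cfg from the partition where the kernel
# was updated to the partition whose /boot/grub GRUB actually reads at boot.
sync_grub_cfg() {
    src_root="$1"    # root of the partition where update-grub ran (e.g. /)
    dst_root="$2"    # root of the partition GRUB boots from (e.g. /mnt/sda3)
    cp "$src_root/boot/grub/grub.cfg" "$dst_root/boot/grub/grub.cfg"
}

# Example usage, assuming /dev/sda3 is mounted on /mnt/sda3:
# sync_grub_cfg / /mnt/sda3
```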
Sunday, March 4, 2012
Fantastic Dublin Science Hackday event indicates that Ireland has a bright future
[Image: crochet model of a red blood cell]
I initially planned to take part in the event myself, but as it drew closer the reality of spending 36 hours of my weekend working on a hack began to scare me and I chickened out. Instead I attended the kick-off talks on the Saturday and then gave a few words of encouragement to the participants before I went back to my normal weekend activities. Luckily there were lots of status updates posted to the #dubscihack hashtag on Twitter, so I could follow along with the excitement from home. I then returned to DCU on Sunday afternoon to see what had been accomplished. I found the participants much more subdued (not surprising after 36 hours without sleep), but the projects were very impressive.
You can see all of the details of the completed projects here, or even watch the project presentations on Ustream, but the prize winners were:
- Best use of Government Data went to the YPath project which developed an application for children to track their physical activity.
- Most Interesting Use of Data went to the Financial Market Sonification project, which produced an audio stream representing a summary of the activity in the market so that traders can have ambient awareness of it.
- The Hardware Award went to the Aurora Lamp which used LEDs to project information about the level of the Aurora Borealis activity.
- The Design Award went to Open Stats Wiki, which developed a cool mobile application to allow fans to use their smartphones to record statistics of a match they attend, live as the action unfolds. This hack should have a bright future because there are many people who share an obsession with sport and an obsession with technology.
- The People's Choice Award went to the Aurora Orrery project, which built a visualization of where on the globe the Aurora activity can be seen.
- Last but not least, the Best in Show Award went to the Elements Trail project, which used the Layar augmented reality platform to build a treasure hunt game based upon the periodic table.
- Tríona (@triploidtree) completed the CIYbio project (crochet it yourself biology), which involved using crochet to build models of biological structures (to help with teaching). I am not sure how useful they were, but the models were certainly cute.
- μsic developed a cool application that mixed music listening with social networking.
- The intermeter project displayed the level of network activity from the Science Hackday on a simple amp meter.
Friday, March 2, 2012
Hello! Hello! Can you hear me?
First, let me explain some of the background for why I developed the application. I used to work on the development of a computer telephony system. We were strong believers in the theory that it was important for the developers to get a good understanding of the end users' perspective of the system and so we encouraged all of the development team to use early builds of the system as much as possible.
[Image: some of the headphones I use]
Because of all of these potential problems I often spend the first few minutes of a telephone meeting shouting "Hello! Hello! Can you hear me?". If I was speaking to another team member I could expect them to be understanding of the wasted time and/or poor audio quality while I tested several headsets to find which was working best. However, when I was making an important call to someone I wanted to impress, I needed some way to be totally confident that all aspects of my telephony setup were working correctly.
Anyone who uses Skype is probably familiar with the "echo123" virtual user. This is a virtual Skype account that anyone can call and be answered by a pleasant sounding lady who will listen to what you say and then repeat it back to you as it sounds to her. I decided to hack together something similar that could be used with any telephony system. After a bit of searching on the internet I found the Voxeo developers site, which offers excellent free resources to anyone wanting to develop voice based applications. Voxeo make their money from providing commercial grade voice response systems for mission critical applications, but in order to convince people how easy it is to develop a user friendly voice interface to their systems, they give developers free access to their powerful web based development environment and they will even host your application on their test servers so that you can see it in action.
Voxeo support a number of programming languages including the industry standard VoiceXML. Developing a VoiceXML server is very complex, but the good news is that since Voxeo have done that, you don't have to. Developing a VoiceXML application is very easy (there are excellent tutorials on the Voxeo site to get you started). I was able to develop my application in under 30 lines of easy to write/understand XML. You can get the full source code here.
The way VoiceXML works is that you specify prompts for the system to play and then you listen for the user to say something (or type a DTMF tone on their keypad). You specify in XML what should be done with the response. You can see that I have only one prompt statement, and I use the text-to-speech function to generate the prompt (it is also possible to record the prompts for a more natural sounding interface).
The only complex line in my code is the one that reads:

<record name="R_1" beep="true" dtmfterm="true" maxtime="10s" finalsilence="1s" silence="3s">

Translated into English, this tag means:
- Record what you hear in a file named R_1.wav
- If you hear a DTMF tone, stop recording
- Listen for a maximum of 10 seconds
- If you hear nothing give up after 3 seconds
- If you hear something then terminate when the speaker leaves a gap of 1 second or more
Obviously real world applications can get more complex, and if you try to recognize what the user is saying things can go hilariously wrong when the caller is not a native speaker. But the general idea is not too hard to master. In any case my application only needs to distinguish between hearing something (in which case the "filled" tag applies) and hearing nothing (in which case the "noinput" tag applies).
If you want to try out the application you can call +1(617)963-0648 to get the version with this source code, or if you prefer the sound of my voice you can call +1(617)500-5332 to hear a slightly modified version where the prompts use a recording of my voice.
Wednesday, February 29, 2012
The business case for Open Data
I attended a great seminar yesterday organised by the Irish Software Association on the business possibilities for open data. The room was packed and we got some very good presentations followed by an interesting panel discussion. As a result of this discussion I got a pointer to an excellent EU funded research report which found that there was almost always a net benefit to a government from either eliminating charges for access to public sector data or reducing the price to a cost recovery level. Similarly, a recent report from Deloitte suggests that open data will be one of the main mechanisms that will enable a recovery from our recent economic woes.
It is very timely that this seminar was held in the week leading up to the Dublin Science Hackday, because I am sure many of the hacks delivered at that event will use some of this open public data.
Here are two of the three presentations delivered (I don't have a soft copy of the slides delivered by Maurice Lynch of Nathean Technologies):
This first presentation was by Dominic Byrne and it describes some of the great initiatives being undertaken by Fingal County Council to promote open government data. I must admit I was initially skeptical when I saw the relatively limited number of feeds cataloged on the Fingal Open Data web site when it launched, but time has proven that it was the correct policy to launch with what was available and then subsequently concentrate on improving it. This is a wonderful local initiative; the only concern I have is that we must not let local initiatives like this blind us to the fact that the real power of open data is its global reach. For example, when I looked for a mobile phone application for DublinBikes, I found that there were no applications written exclusively for DublinBikes, but lots of applications written for city bike rental schemes which could be configured to work with bike rental schemes in many different cities, including Dublin. From looking at the web sites associated with these applications I noticed that many of them were initially developed for a specific city that the developer was familiar with, but the developers soon realized that they could easily broaden the appeal of their app by making it configurable to work with similar bike rental systems elsewhere.
This second presentation was by Jonathan Raper, whose company Placr has made a great business out of exploiting open data in the UK. Both during his presentation and during the subsequent panel discussion, he shared some of the lessons he learned from the struggle to promote open access to public data in the UK. One interesting story he told was about what happened when the London transport authority realized that many applications were being developed that relied upon a feed that they had accidentally made public. Their IT department initially wanted to shut down the feed because the popularity of the applications meant that too much load was being generated on their infrastructure. However, the mayor was aghast at this suggestion because he saw that there would be a huge political backlash if these popular applications suddenly stopped working. Someone pointed out that this was really an online equivalent of the ancient legal principle of "right of way".
Monday, February 27, 2012
Less than a week to go until Dublin Science Hackday
Initially I was planning to organise a team of IBM employees to compete in this event, but the reaction to this suggestion was not very positive. On the one hand, employees who had spent a busy week from Monday to Friday working hard on their technology projects did not relish the prospect of spending a 36 hour stretch of their weekend working on another demanding project. Also, some of the organisers felt that it might not set the right tone to have a team of experienced professionals competing against teams mainly comprised of enthusiastic students with little practical experience.
Instead I think I will be arranging for some IBM employees to attend the event as advisers either giving training talks on technology topics that would be useful to hackers and/or informally offering practical help to the project teams as they complete their projects.
In any case I am looking forward to an exciting weekend of technology. I saw the great projects completed at previous Hackdays held in cities such as London, San Francisco and Cape Town, and I know that the Dublin hackers will be just as good (if not even a little better).
Saturday, February 25, 2012
Are you going to tell me or do I have to ask?
Recently we had a training session where the speaker was explaining how to use the various tools and enthusiastically extolling the benefits of using them. Some of the more reluctant participants raised a few issues that they had encountered, and we had a lively discussion on whether these new tools actually help solve the problem of information overload or whether they actually make the problem worse, because now people are expected to read a selection of blogs and keep up with activity in various Lotus Connections communities as well as reading their email.
I did some thinking about this issue afterwards and I came to the conclusion that social media tools do not inherently solve the problem of information overload, but they do transfer control of the information flow from the sender to the receiver. This is a very significant change, but in order to take advantage of this it is necessary for the receiver to put some thought into what information they actually want to receive and then they also need to put a little bit of work into implementing a process of receiving information which works for them.
Prior to the invention of the internet, people used to communicate via physical letters and newspapers. With letters the sender used to decide whom they would send the letter to and with newspapers the reader would decide what newspapers they wanted to read. There was a cost associated with sending a letter and also a cost to purchase a newspaper so this acted as a natural limiting factor on the amount of information that anyone received. Once the internet was invented, email replaced letters and web sites replaced newspapers.
Since the cost of sending an email is virtually zero people have a tendency to send copies of their emails to more people than just the people who really need/want to receive the information in the email and this leads to information overload. Automated SPAM filters can deal with the really obvious abusers of the email service, but it still is not feasible for most people to read all of the emails they receive each day. If you really want to be in control of your information inflow you need to establish some procedure for deciding what emails you will read (personally I like the GTD system, but I am sure there are many alternatives).
Nobody should limit their information inflows to just the information that other people decide to send them. Therefore you ought to regularly spend some time seeking out new information. Each person will find a different system that works best for them e.g. reading the daily newspaper, reading a series of favourite blogs, logging into Facebook etc. Since everyone has different preferences there is no one system that will work for everyone, but you ought to make a conscious decision about how to go about seeking "news" because the sources of information you rely upon will have an important effect upon how successful you will be in life.
Needless to say another important factor in reducing information overload is that people need to feel empowered and confident enough to tell people "no I didn't read your email". If you do not feel personally empowered to take control of your own information inflow, no tool can help you.
Saturday, February 18, 2012
Security for stand-alone Java programs that use the Notes API
Since its launch in the early 1990s, Lotus Notes has had a wonderful security architecture, built on a public/private key infrastructure that has only recently been widely adopted by other platforms. Unfortunately this also means that many people are not really familiar with how it works.
I recently blogged about how to write stand-alone Java programs which can manipulate Notes/Domino databases. A few people commented on the fact that when they run their stand-alone programs on a machine with a Notes client installed, they are normally prompted to enter a password each time the program launches. Since they don't get a similar password prompt when they run their program on a machine with a Domino server installed, they seem to think that the Domino server code gives them some form of privilege to bypass Notes security. This is not true, but to explain why I will first need to explain some important facts about the Notes/Domino security model.
- Notes implements what security experts call "two factor authentication". This means that you need to prove your identity by two different mechanisms. Firstly you need to have an ID file containing your private key and secondly you need to know the password used to secure the ID file.
- Domino servers also have ID files to prove their identity. However, while most administrators insist that their real human users secure their ID files with complex, frequently changed passwords, they typically don't put any password at all on their servers' ID files. This is because otherwise a human would need to be present to type in the password every time the server is restarted, and in any case sharing the ID file's password with all of the potential administrators would negate any security benefit from having a password.
Domino has the concept of "scheduled agents" which run on a server in the background and perform various maintenance tasks. If these tasks executed with the server's ID, it would be necessary to give the server access rights to a lot of databases, which would not be very secure. Instead, Domino runs each scheduled agent with the access rights of the user who signed the agent. This means that each user can run their own version of an agent with access rights to just their own databases.







