I recently blogged about Nintendo Hotspot data and mentioned that it could be consumed more usefully in a native mobile app.
As such, I wrote a small Android app that retrieves this data and displays it on a Google Map. The app shows nearby hotspots, lets users search for places further afield, and shows information on the venue hosting each zone.
Since getting a 3DS, I've found StreetPass quite addictive. It's actually pretty fun checking the device after walking through town or using public transport to see a list of Miis representing the people you've been near recently, and the minigames (such as StreetPass Quest) that require you to 'meet' people in order to advance make it more involving. Essentially, the more you're out and about, the further you can progress. This is further accentuated through Play Coins, which are earned for every 100 steps taken whilst carrying the device and can be used to help 'buy' your way forward.
3DS systems can also use relay points in Nintendo Zone hotspots to collect StreetPass hits. These zones are special WiFi access points hosted in certain commercial venues (e.g. McDonald's and Subway restaurants), and allow you to 'meet' people around the world who happen to be in another Nintendo Zone at the same time. As such, users can collect a lot of hits very quickly (up to a maximum of 10 at a time). People have found various ways to set up a 'home' zone, but Nintendo have also published a map showing official nearby zones.
However, their map seems a little clunky to use while out and about, so I wanted to see if there was an easier way to get at this information quickly. When using the map, the network logs revealed GET requests being made to:
The location for which to retrieve data is specified through the zoom and bbox parameters, which seem to map directly to the zoom level and bounds reported by the underlying Google Maps API. For some reason, the parameter summary_mode=true also needs to be set. As such, an unencoded request for central Cardiff might look like this:
Here the coordinates (51.480043,-3.180592) and (51.483073,-3.173028) represent the lower-left and upper-right corners of the bounding box respectively. The response is JSON, and contains a lat/lng for each zone, a name, and an ID that can be used to retrieve more information about the venue hosting the zone using this URL format:
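As a concrete sketch, a request with those parameters can be assembled as below. The host is a placeholder rather than the real endpoint, and the lat/lng ordering inside bbox is an assumption based on how the Google Maps API reports bounds:

```go
package main

import (
	"fmt"
	"net/url"
)

// buildZoneURL assembles a hotspot query for the given zoom level and
// bounding box (lower-left then upper-right corner). The base URL is a
// placeholder, not the real endpoint.
func buildZoneURL(zoom int, swLat, swLng, neLat, neLng float64) string {
	q := url.Values{}
	q.Set("zoom", fmt.Sprintf("%d", zoom))
	q.Set("bbox", fmt.Sprintf("%f,%f,%f,%f", swLat, swLng, neLat, neLng))
	q.Set("summary_mode", "true") // required, for whatever reason
	return "https://example.com/zones?" + q.Encode()
}

func main() {
	// The central-Cardiff bounding box from the example above.
	fmt.Println(buildZoneURL(16, 51.480043, -3.180592, 51.483073, -3.173028))
}
```

From there it's a plain GET — no session or authentication needed, as noted below.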
When the map is zoomed out, a zone 'group' may be returned instead of individual zones (to prevent cluttering the map), with each group indicating how many zones it contains. Zooming back in on a group then reveals the individual zones in that area.
It seems that this endpoint does not support cross-origin resource sharing (CORS), which means the data is not retrievable by a third-party web app (at least, not without some degree of proxying) due to browser restrictions. However, since the endpoint currently requires no session or other authentication, the data is very easily retrievable and manageable for non-browser applications and other kinds of systems.
A couple of years ago I wrote a blog post about wrapping some of Weka's classification functionality so it could be used programmatically from Python programs. A small project I'm currently working on at home takes some of the later research from my PhD work to see if it can be expressed and used as a simple web app.
I began development in Go as I hadn't yet spent much time working with the language. The research work involves using a Bayesian network classifier to help infer a tweet's interestingness, and while Go machine-learning toolkits do exist, I wanted to use my existing models that were serialized in Java by Weka.
I started working on WekaGo, which is able to programmatically support simple classification tasks within a Go program. It essentially just manages the model, abstracts the generation of ARFF files, and executes the necessary Java to make it quick and easy to train and classify data:
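In practice you'd call WekaGo's own API for this; purely to illustrate the kind of thing it does under the hood, here's a standalone sketch that builds the java invocation for training a Weka classifier from an ARFF file (the paths and classifier class are illustrative):

```go
package main

import (
	"fmt"
	"os/exec"
	"strings"
)

// wekaTrainCmd builds the sort of java command a wrapper like WekaGo ends
// up running: train the named classifier on an ARFF file and serialise the
// resulting model. All arguments here are illustrative.
func wekaTrainCmd(classpath, classifier, trainARFF, modelOut string) *exec.Cmd {
	return exec.Command("java",
		"-cp", classpath,
		classifier,
		"-t", trainARFF, // -t: Weka's training-set flag
		"-d", modelOut, // -d: serialise the trained model to this file
	)
}

func main() {
	cmd := wekaTrainCmd("weka.jar", "weka.classifiers.bayes.BayesNet", "train.arff", "tweets.model")
	fmt.Println(strings.Join(cmd.Args, " "))
	// cmd.Run() would actually execute it, given Java and Weka on the machine.
}
```

Classifying against a saved model is the same idea with Weka's -l (load model) and -T (test set) flags.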
The results from the classification can then be examined programmatically.
As is the case with many people, all music I listen to on my PC these days plays from the web through a browser. I'm a heavy user of Google Play Music and SoundCloud, and using Chrome to handle everything means playlists and libraries (and the way I use them through extensions) sync up properly everywhere I need them.
On OS X I used BeardedSpice to map the keyboard media controls to browser-based music players, and the volume keys adjusted the system volume as they should. Using i3 (and other lightweight window managers) can make you realise what you take for granted in more fully-fledged arrangements, but it doesn't take long to achieve the same functionality on such systems.
A quick search revealed keysocket — a Chrome extension that listens for the hardware media keys and can interact with a long list of supported music websites. To get the volume controls working, I needed to wire the keys through to ALSA via i3, and this turned out to be pretty straightforward too. It only required three lines in my i3 config to handle the volume-up, volume-down, and mute keys:
bindsym XF86AudioRaiseVolume exec amixer -q set Master 4%+ unmute
bindsym XF86AudioLowerVolume exec amixer -q set Master 4%- unmute
bindsym XF86AudioMute exec amixer -q set Master toggle
And for fun, I added a block to ~/.i3status.conf to get the volume displayed on the status bar.
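A minimal version using i3status's built-in volume module looks something like this (the format strings are just a matter of taste):

```
order += "volume master"

volume master {
    format = "♪: %volume"
    format_muted = "♪: muted (%volume)"
    device = "default"
    mixer = "Master"
    mixer_idx = 0
}
```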
The module is called Web and Social Computing, and its main aim is to introduce students to the concepts of social computing and web-based systems. The course includes both theory and practical sessions, allowing students to complement what they learn from the literature with practice of the key concepts. We'll also have lots of guest lectures from experts in specific areas to help reinforce the importance of this domain.
As part of the module, I will encourage students to increase their web presence and to interact with a wider community on the Internet. They'll do this by engaging more with social media and by maintaining a blog on things they've learned and researched.
Each week, the students will give a 5-minute Ignite-format talk on the research they've carried out. The quick presentation style will allow everyone in the group to convey what they feel are the most important and relevant parts in current research across many of the topics covered in the module.
We'll cover quite a diverse range of topics, starting with an introduction to networks and coverage of mathematical graph theory. This will lead on to social networks, including using APIs to harvest data in useful ways. In the final few weeks, we'll delve into subjects around socially-driven business models and peer-to-peer finance systems, such as Bitcoin.
During the course, I hope that students will gain practical experience with various technologies, such as NetworkX for modelling and visualising graphs in Python, Weka for some machine learning and classification, and good practices for building and using web APIs.