depth+rgb camera

I just bought myself an Asus Xtion PRO Live depth+RGB camera which I plan to use for robotics experiments. It uses the same PrimeSense depth-sensing technology as the Microsoft Kinect but is about half the size, can be powered solely over USB and weighs around 170 g, which makes it a better match for robotics.

asus xtion pro live vs. matchbox

xtion_pro_on_wild_thumper_20111221_003

Here are my notes on getting the basic OpenNI/NITE demos running on Ubuntu 11.10:

sudo apt-get install build-essential libusb-1.0-0-dev freeglut3-dev

install openni

mkdir openni
cd openni
git clone https://github.com/OpenNI/OpenNI.git
cd OpenNI/Platform/Linux-x86/CreateRedist
./RedistMaker
cd ../Redist/
sudo ./install.sh

install sensor

git clone https://github.com/PrimeSense/Sensor.git
cd Sensor/Platform/Linux-x86/CreateRedist/
./RedistMaker
cd ../Redist
sudo ./install.sh

install PrimeSense NITE. This appears to be closed source but free of charge.
Download it from http://www.openni.org/Downloads/OpenNIModules.aspx under "Middleware Binaries". In my case it looks like this:

tar -xf nite-bin-linux64-v1.4.2.3.tar
cd <extracted directory>
sudo ./install.sh

this will prompt you for a key, which is: 0KOIk2JeIBYClPWVnMoRKn5cdY4=

then go to the directory containing the NITE samples and try out some of the demo apps, for example Sample-Players:

cd Samples/Bin/Release
./Sample-Players

This is how Sample-Players looks when it has identified me in the picture:
openni nite SamplePlayers demo

on durability of the linksys wrt54g

I have two Linksys WRT54G wifi APs at my grandparents’ place that have been running since May 2005. The one facing the Internet is connected to a 24 dBi parabolic wifi (2.4 GHz) antenna directed towards a tower 8 km away; the other is connected to an omni antenna and configured as an open access point.

The Internet is somewhere there

The other end of the link as (not) seen from the position of our antenna. It's usually not visible to the naked eye in daylight but the lights are visible at night.

wifi antennas

Our improvised antenna tower

These two WRT54Gs sit in a garage without any special kind of enclosure. The garage is not heated and is basically always at the outside temperature, so the gear has seen days below -30 C and several weeks with the temperature constantly around -20 C. It has pulled through all of this without any problems. So far I have only had to replace a couple of power supplies because of lightning; otherwise the installation has required no attention whatsoever. I wish all of my gear was that reliable.

Actually it has required so little attention that it has been years since I last attempted to log into these routers. They are still running an ancient OpenWrt build from 2005. It turns out that modern versions of the OpenSSH client are not able to talk to the old version of the Dropbear SSH daemon that I have on the APs.

In verbose mode the OpenSSH client will give you something like this if you try:

debug1: Remote protocol version 2.0, remote software version dropbear_0.45
debug1: no match: dropbear_0.45
debug1: Enabling compatibility mode for protocol 2.0
debug1: Local version string SSH-2.0-OpenSSH_5.8p1 Debian-1ubuntu3
debug1: SSH2_MSG_KEXINIT sent
debug1: SSH2_MSG_KEXINIT received
debug1: kex: server->client aes128-cbc hmac-md5 none
debug1: kex: client->server aes128-cbc hmac-md5 none
debug1: sending SSH2_MSG_KEXDH_INIT
debug1: expecting SSH2_MSG_KEXDH_REPLY
Connection closed by 192.168.9.1

A nice and simple workaround that I found is to use the SSH client that the Dropbear package itself provides, called dbclient (e.g. dbclient root@192.168.9.1).

PS. I plan to retire one of these APs soon because I have a Ubiquiti Bullet2 lying around that is a much better fit for such environments, and I believe it’s better to replace them before the inevitable failure comes along.

outdoor robot navigation

wild_thumper_with_sonar

I spent some time experimenting with an SRF-08 sonar for outdoor robot navigation. While this sonar is specified for a range of 6 m, I haven’t gotten any measurements beyond ~1 m even indoors.

Outdoors it’s of course even less reliable: at a mounting height of 20 cm the usable range seems to be about 30 cm, depending a bit on the height of the grass and the evenness of the terrain. Since the speed of this robot is about 5 km/h, I have roughly 200 ms from the beginning of the measurement cycle to actually hitting the obstacle. This might be just about enough if I turn the ping interval down to the 10 ms range and avoid doing anything else in the main loop.
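As a sanity check of that ~200 ms figure, here's the arithmetic using the 5 km/h speed and 30 cm range from above:

```python
# Reaction-time budget for obstacle avoidance, using the numbers above.
SPEED_KMH = 5.0   # robot speed
RANGE_M = 0.30    # usable outdoor sonar range at 20 cm mounting height

speed_ms = SPEED_KMH * 1000 / 3600        # ~1.39 m/s
time_to_impact_s = RANGE_M / speed_ms     # time from detection to impact

print(f"time to impact: {time_to_impact_s * 1000:.0f} ms")  # prints "time to impact: 216 ms"
```

So even before subtracting the ~10 ms measurement cycle there is only about a fifth of a second to react.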

Another option is of course to cap the robot’s speed to something slower, but that wouldn’t be much fun.

I wonder if the MaxSonar-WR series would perform better in this scenario, especially the models with narrower beam width. For some reason I couldn’t find any actual reviews of its outdoor performance, and $100 seems a bit much just for finding out.

greenhouse window automation

With a lot of help from my friend Kalle (actually he did ~95% of the work) we managed to get the window automation running.

We haven’t had time to repackage and rewire the control panel, so it looks a bit hackish at the moment:
greenhouse controller

The windows are actually controlled from the Beagleboard in the garage over WiFi. The Beagleboard also has an IPv6 address, so we can access it directly from anywhere in the world even though it sits behind several NAT gateways.

Here’s a high level overview of how the automation is set up at the moment:

high level automation overview

And here’s a slightly more detailed schematic of how the electronics in the greenhouse are connected:
high level overview of greenhouse electronics

Wild Thumper

One of my long-term goals is to create an outdoor robot that would automate certain kinds of gardening tasks. I have built some simple indoor robots over the years but haven’t really gotten around to building something usable outdoors, since there’s a rather large gap in complexity between the two environments.

For indoor robots you can basically use toy car wheels attached directly to servos and navigate with simple wheel encoders. Outdoors you need much bigger wheels, DC motors, motor controllers, suspension, large batteries and preferably a shell that protects against the elements. Navigation also becomes a challenge: wheel encoders become rather useless because of constant slippage, and GPS is currently still far too imprecise for navigating in a garden.

So to avoid spending too much time on the mechanics I decided to get Dagu’s Wild Thumper 6WD platform, which handles uneven terrain well, as can be seen in the following demo video from Dagu:

Here’s a picture of my Wild Thumper with the Wild Thumper motor controller & a 5000 mAh Li-ion battery:
wild_thumper_6wd

Here’s my own first radio controlled test run:

And here’s a demo of steering it with the Nokia N900 accelerometers:

By the end of this year I hope to get it navigating autonomously in my garden, which is a precondition for most of the interesting applications but a very complicated task all by itself.

smart energy

Lately I have been hacking on a home automation project which should at some point tie together my current ad-hoc house and greenhouse control systems. One part of that system will be energy usage optimization with wireless smartplugs, using Plugwise smartplugs and the python-plugwise library that I wrote. The main (and more or less the only :-)) appliance I can make energy savings on using automation is the electric boiler. It’s an old 40-50 l heater that hasn’t been cleaned for at least 3 years, so it probably contains a lot of limescale and sludge, which causes it to use far more power than it should. Hopefully I will get around to cleaning it soon.

I haven’t implemented any fancy control logic yet; instead I just use a simple crontab that switches the boiler on for 3 hours in total each day: somewhere around when we leave for work and again before we get back home.
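I haven't reproduced the actual crontab here; as a sketch, the schedule it implements boils down to something like this (the exact on/off times and the weekday-only restriction are my assumptions, not the real settings):

```python
# Hypothetical sketch of the boiler schedule: 3 hours of heating per day,
# split between a morning and an evening slot. Times are made up.
from datetime import time, datetime

MORNING = (time(6, 30), time(8, 0))    # 1.5 h around leaving for work
EVENING = (time(16, 0), time(17, 30))  # 1.5 h before getting back home

def boiler_should_be_on(now: datetime) -> bool:
    """True while the boiler should be drawing power."""
    if now.weekday() >= 5:   # assumed: schedule only applies on workdays
        return False
    t = now.time()
    return any(start <= t < end for start, end in (MORNING, EVENING))
```

In practice the same two windows are just four crontab entries that toggle the smartplug on and off.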

So far this seems to be enough to provide hot water for 2 people on working days, and the boiler uses about half the power it would if it were permanently switched on:
boiler_energy_usage

So, assuming this difference holds over longer periods, I will save ~4 kWh per day. If I only use this control method on working days, that works out to around 80 kWh per month. Currently I pay 0.1026 € per kWh, so I should save around 8 € per month and 96 € per year on this appliance, meaning the smartplugs should pay for themselves in about 1.5 years. Estonia currently has the cheapest electricity in Europe, and prices can only go up as the energy market opens and hidden subsidies are removed, so the savings will probably be even more substantial in the not so distant future.
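For reference, the arithmetic behind those numbers (the 20 working days per month is my assumption):

```python
# Rough savings estimate from the figures above.
daily_saving_kwh = 4.0    # ~4 kWh/day saved vs. an always-on boiler
working_days = 20         # per month (assumed)
price_eur_kwh = 0.1026

monthly_kwh = daily_saving_kwh * working_days   # 80 kWh
monthly_eur = monthly_kwh * price_eur_kwh       # ~8.2 €
yearly_eur = monthly_eur * 12                   # ~98 €
print(f"{monthly_kwh:.0f} kWh ≈ {monthly_eur:.1f} €/month, {yearly_eur:.0f} €/year")
```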

elections

Yesterday the Riigikogu elections took place, and this was already the second election where the electoral commission's website at some point gave up on serving fresh results.

Helmes, who built the software, came out today with an interesting scapegoat: the problems were supposedly caused by the open source database engine PostgreSQL, the software they wrote worked perfectly, and no adequate load testing was done because it was supposedly impossible.

First, blaming PostgreSQL seems highly inappropriate here; a great many systems around the world run on it with data volumes and loads incomparably larger than anything this election system should have had to handle (the Skype user database, for one, runs on PostgreSQL). Personally it seems to me that blaming Postgres was simply much more convenient than admitting that they didn't test and tune the system sufficiently, since unlike with, say, Oracle, there is nobody behind it who would sue you for libel.

Second, saying "our software worked perfectly, it was tool X that we used that failed" is rather inappropriate, since the developer is generally responsible for the whole. PostgreSQL has proven itself well enough in the world; the issue appears to lie purely in the application architecture and/or the server configuration. It would be nice to hear a comment from Hannu Krosing or some other Postgres guru here.

And finally, the claim that "we had tested and verified everything on our side and there was nothing more we could do": testing this application under fully realistic load should be fairly easy. Estonia, with its ~600,000 votes, is a tiny thing to simulate. The Chinese might have a slightly harder time 😛

But so that this wouldn't remain empty pontificating, I threw together a naive election application this morning on the train to work, to see how long filling such a database and later determining the winners from it would take on an ordinary desktop machine.

First, some assumptions:

I only create tables for votes and candidates. In reality there should of course be more tables: electoral lists, parties, districts, polling stations and probably a few more that don't come to mind right away. These tables can safely be ignored, though, because apart from the votes table everything else should be fairly constant and pre-filled.

I assume that every vote is a separate row in the votes table. In practice this is probably not the case, and a polling station more likely reports vote counts as one row per candidate, à la candidate_X got 1000 votes. That would be significantly easier performance-wise, since instead of 600,000 inserts there would be more like 50,000. I deliberately chose the much worse variant performance-wise to see how long it would take.

I assume that a polling station reports all of its votes at once, i.e. inserting each individual vote is not a separate transaction; rather, inserting all of one polling station's votes is.

I assume that the graphs and stats shown to web users are not served directly from the database; rather, a static page is generated, say, once a minute. There doesn't seem to be any reason why the user-facing page should hit the database at all, and this is presumably how it was done, since during the problems the page still loaded quickly, just with stale data. This means I don't need to emulate a couple hundred thousand different SELECT queries per second hitting the database.

Reading their explanation, the general impression is that the query planner was making decisions based on stale table statistics (VACUUM ANALYZE hadn't been run in the meantime), so a less efficient execution plan was preferred. For example, when a table contains a few hundred rows it can be perfectly reasonable to walk all rows of the table (a full scan) instead of going through an index. The article gives the impression that when the trouble started, the admins looked at the running queries and their execution plans, ran VACUUM ANALYZE and then waited about an hour for the query plans to change. The database IO must be extremely overloaded for a VACUUM ANALYZE to take that long. For comparison, on my 600,000-row votes table it took ~2 s on an ordinary desktop machine.

So the schema looks like this:

CREATE TABLE candidates(
    candidate_id INT PRIMARY KEY,
    name TEXT
);

CREATE TABLE votes(
    electoral_district_id INT NOT NULL,
    candidate_id INT NOT NULL REFERENCES candidates(candidate_id)
);

CREATE INDEX idx_candidate_id ON votes(candidate_id);

And the script that inserts the "election results" is here.
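The script itself is only linked above; as a rough illustration of what such a loader might do under the assumptions listed earlier (every vote its own row, one transaction per polling station), here is a hypothetical sketch — the psycopg2 usage, table sizes and function names are mine, not the actual script's:

```python
# Hypothetical loader sketch: every vote is its own row, and all votes of
# one polling station are inserted in a single transaction.
import random

N_CANDIDATES = 1000
N_DISTRICTS = 1000
VOTES_PER_DISTRICT = 600      # 1000 * 600 = 600 000 votes in total

def district_report(district_id, rng):
    """One polling station's worth of (district, candidate) vote rows."""
    return [(district_id, rng.randrange(N_CANDIDATES))
            for _ in range(VOTES_PER_DISTRICT)]

def load_votes(conn):
    """Insert all votes, committing once per polling station."""
    rng = random.Random(0)
    cur = conn.cursor()
    for district in range(N_DISTRICTS):
        cur.executemany(
            "INSERT INTO votes (electoral_district_id, candidate_id)"
            " VALUES (%s, %s)",
            district_report(district, rng))
        conn.commit()          # one transaction per polling station
    print("votes inserted")

# usage (needs psycopg2 and a prepared "elections" database):
#   import psycopg2
#   load_votes(psycopg2.connect(dbname="elections"))
```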

Inserting 600,000 votes this way takes a bit under 4 minutes:

hadara@hadara-desktop:~$ python elections.py 
        candidates inserted
        votes inserted
tables filled in: 211.85s

So this is PostgreSQL running with mostly default settings on an ordinary desktop machine (shared_buffers turned up to 256 MB; real servers would probably use 4+ GB).

Querying the winners from such a database could look something like this:

elections=# SELECT votes.candidate_id, COUNT(*) AS votecount, (SELECT name FROM candidates WHERE candidates.candidate_id=votes.candidate_id) AS candidate_name FROM votes GROUP BY votes.candidate_id ORDER BY votecount DESC LIMIT 10;
 candidate_id | votecount | candidate_name 
--------------+-----------+----------------
          106 |       600 | candidate_106
          120 |       600 | candidate_120
          285 |       600 | candidate_285
          681 |       600 | candidate_681
          866 |       600 | candidate_866
          264 |       600 | candidate_264
          887 |       600 | candidate_887
          601 |       600 | candidate_601
          664 |       600 | candidate_664
          251 |       600 | candidate_251
(10 rows)

This query takes 146 ms and its execution plan looks like this:

elections=# EXPLAIN ANALYZE SELECT votes.candidate_id, COUNT(*) AS votecount, (SELECT name FROM candidates WHERE candidates.candidate_id=votes.candidate_id) AS candidate_name FROM votes GROUP BY votes.candidate_id ORDER BY votecount DESC LIMIT 10;
                                                                     QUERY PLAN                                                                     
----------------------------------------------------------------------------------------------------------------------------------------------------
 Limit  (cost=19956.81..19956.83 rows=10 width=4) (actual time=146.603..146.605 rows=10 loops=1)
   ->  Sort  (cost=19956.81..19959.31 rows=1000 width=4) (actual time=146.602..146.603 rows=10 loops=1)
         Sort Key: (count(*))
         Sort Method:  top-N heapsort  Memory: 25kB
         ->  HashAggregate  (cost=11655.00..19935.20 rows=1000 width=4) (actual time=144.690..146.393 rows=1000 loops=1)
               ->  Seq Scan on votes  (cost=0.00..8655.00 rows=600000 width=4) (actual time=0.005..36.190 rows=600000 loops=1)
               SubPlan 1
                 ->  Index Scan using candidates_pkey on candidates  (cost=0.00..8.27 rows=1 width=13) (actual time=0.001..0.001 rows=1 loops=1000)
                       Index Cond: (candidate_id = $0)
 Total runtime: 146.651 ms
(10 rows)

I am not involved with the elections in any way and I last touched PostgreSQL about 8 years ago, so this is pretty much shooting from the hip.

Update: Martin Rebane, who has considerably more writing talent than I do, has also written on the same topic.

winter

The temperature still drops to around -30 degrees C at night and the pipes sometimes freeze, but at least nature looks really nice.

suusarada_small

hingu_oja_small

And it’s time to get ready for the spring…

plants