
Wednesday, November 2, 2016

Automatic watering system using Gardena 1197 + OpenHab + Pilight

I was tired of the repetitive chore of watering my garden every day, so I wanted a solution to manage the garden watering. It was not as simple as just using one circuit: my garden is spread out, so different areas need different watering times, and water pressure is a problem. I also did not want to dig my garden up to lay down new water pipes, so I wanted to be able to use my existing garden hoses.

The solution was to use a Gardena automatic water distributor along with a single electric valve from Hunter. This keeps the electrics simple and the connections plug and play. The control is done with openHAB running on a Raspberry Pi, which triggers the valve via a remote 433 MHz switch. You could obviously use a valve per circuit, but that requires another bridge to the Pi, and I already had the 433 MHz transmitter set up with pilight. With the setup below you can also switch an electric pump instead of the valve, which I sometimes do when I have collected enough rain water.

Parts list:
1.) Remote switch - 25 Euro - although I had this already
2.) Gardena automatic water distributor - 55 Euro
3.) 24 V AC transformer - 12 Euro
4.) Hunter PGV-100 valve - 16 Euro - basically cheaper than the Gardena version
5.) Raspberry Pi - 40 Euro - although I had this already as well
6.) Aurel 433 MHz transmitter
7.) Various garden hoses to the different sections of the garden; 4 circuits in use, but up to 6 can be used


View of the switch, the transformer and the Hunter valve

Garden Distributor
The Gardena distributor works quite simply: it switches to the next circuit when there has been no water pressure for a certain period of time (I was using about 10 seconds). So with an on-off-on pattern you can control up to 6 circuits with different time periods if you want.

View of control panel in Openhab basic UI
Pilight needs to be installed and the pilight binding needs to be set up inside openHAB.

addons.cfg:
    pilight:kaku.host=192.168.2.168
    pilight:kaku.port=5000
    pilight:kaku.delay=1000

Pilight devices:
    "devices": {
        "desklamp": {
            "protocol": [ "kaku_switch" ],
            "id": [{
                "id": 15831690,
                "unit": 1
            }],
            "state": "off"
        },

Items file:
Switch  KakuDeskLamp    "Water"               {pilight="kaku#desklamp"}
Switch  WaterTest    "Water Test"
Switch  StartProgram    "Start Program"
Switch  KakuSmallBathFan    "Small bath fan"               {pilight="kaku#smallbathfan"}
Number      Irrigation_LawnMins     "Lawn Sprinkler [%d mins]"  <water>         (Irrigation)
Number      Irrigation_Pause     "Break [%d mins]"  <water>         (Irrigation)
Number      Irrigation_Multiplier     "Multiplier [%d mins]"  <water>         (Irrigation)
Number      Irrigation_Circuits     "Circuits [%d]"  <water>         (Irrigation)
Number      Irrigation_ActiveCircuit     "Active Circuit [%d]"  <water>         (Irrigation)

Number      Irrigation_Repeats     "Repeats [%d]"  <water>         (Irrigation)

OpenHAB rules:
    rule "Enabling irrigation"
    when
        Item StartProgram changed from OFF to ON or
        Time cron "0 45 7 1/1 * ? *"   // every day at 07:45
    then
        var i = 0
        var Number lawnMins = Irrigation_LawnMins.state as DecimalType
        var Number pause = Irrigation_Pause.state as DecimalType
        var Number Multiplier = Irrigation_Multiplier.state as DecimalType
        var Number Circuits = Irrigation_Circuits.state as DecimalType
        var Number Repeats = Irrigation_Repeats.state as DecimalType

        // lawnMins*1000 gives the base sleep per circuit in milliseconds,
        // optionally scaled by the multiplier
        lawnMins = lawnMins*1000
        if (Multiplier != 0)
        {
            lawnMins = lawnMins*Multiplier
        }
        logInfo("Irrigation", "Circuits " + Circuits + " multiplier " + Multiplier)

        while ((i=i+1) < Circuits+1) {
            // Repeat the 433 MHz ON command several times so it is not missed
            var j = 0
            while ((j=j+1) < Repeats+1) {
                logInfo("Irrigation", "Repeat " + j)
                sendCommand(KakuDeskLamp, ON)
                Thread::sleep(100)
            }

            // Water the current circuit
            logInfo("Irrigation", i + " lawnMins " + lawnMins)
            sendCommand(Irrigation_ActiveCircuit, i)
            Thread::sleep(lawnMins.intValue)

            // Switch the valve off again, also repeating the command
            j = 0
            while ((j=j+1) < Repeats+20) {
                logInfo("Irrigation", "Repeat " + j)
                sendCommand(KakuDeskLamp, OFF)
                Thread::sleep(100)
            }

            sendCommand(Irrigation_ActiveCircuit, 0)

            // Pause with the water off so the Gardena distributor steps to the next circuit
            Thread::sleep(1000*pause.intValue)
        }
        sendCommand(StartProgram, OFF)
    end


Sunday, November 10, 2013

Raspberry Pi Stereo Camera

Stereo camera with two Raspberry Pis
So I made my first stereo camera this weekend with two Raspberry Pis. It actually worked out pretty easily. The stereo angle of the cameras is not exact and is only controlled with pieces of paper and elastic bands: the blue bands in the middle and the paper on the outer edges tilt the cameras a little more inwards. Here is an example anaglyph stereo image (you will need red/blue stereo glasses to view it properly):

It would be best to have proper adjustment screws to set the angles. There are small holes on the camera boards that would allow such screws to be attached, so it is just a matter of finding the right adjustable screws; thinking about it now, that should not be a big job. I know the stereo alignment is a bit off, which should be fixed, but my eyes were able to find the right focus and you can see the 3D effect quite nicely. The code is up at https://github.com/arcanon/raspbot. It won't compile out of the box, but have a look at the video reader for the capture loop and the anaglyph composition.

The CUDA kernel that composites the anaglyph looks like this:

// Combine the left and right input images into one anaglyph image:
// the left image goes into the red channel, the right image into the blue channel.
__global__ void anaglyph_dev(char* imageLeft, char* imageRight, char *imageOut, int pitchInputs, int pitchOutput, int width, int height)
{
    // One thread per output pixel
    int x = blockIdx.x * blockDim.x + threadIdx.x;
    int y = blockIdx.y * blockDim.y + threadIdx.y;

    if (x >= width || y >= height)
        return;

    // Inputs are addressed one byte per pixel, the output four bytes per pixel
    int linearPosInputs = y*pitchInputs + x;
    int linearPosOutput = y*pitchOutput + x*4;

    imageOut[linearPosOutput]   = imageLeft[linearPosInputs];  // red from the left camera
    imageOut[linearPosOutput+1] = 0;                           // green left empty
    imageOut[linearPosOutput+2] = imageRight[linearPosInputs]; // blue from the right camera
}
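
For completeness, launching a kernel like this from the host could look roughly like the sketch below. The 16x16 block size and the wrapper function are my own illustration and are not taken from the repository:

#include <cuda_runtime.h>

// Hypothetical host-side wrapper for the kernel above; the 16x16 block size
// and the function name are assumptions, not the repository's actual code.
void composeAnaglyph(char* devLeft, char* devRight, char* devOut,
                     int pitchInputs, int pitchOutput, int width, int height)
{
    // Cover the image with 16x16 thread blocks, rounding up at the edges
    dim3 block(16, 16);
    dim3 grid((width + block.x - 1) / block.x,
              (height + block.y - 1) / block.y);

    anaglyph_dev<<<grid, block>>>(devLeft, devRight, devOut,
                                  pitchInputs, pitchOutput, width, height);
    cudaDeviceSynchronize();
}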

Monday, September 9, 2013

A New Build, speed control and ATmega/Arduino read-back to the Raspberry Pi

So I finally got some time to provide the updated info for my rasp bot. In this post, I give updates on how I have rebuilt the bot to drive more accurately (the last version, with such small wheels, did not fare so well on different surfaces and the motors were not great), and I talk about the SPI enhancements for speed control and voltage read-back.

The new build:

1.) a left and right servo for the wheels
2.) much larger wheels
3.) a wooden chassis to hold the breadboard and camera

I learnt during the upgrade that wood and elastic bands are really good materials for building custom projects: wood because it's relatively easy to shape, and elastic bands because you can hold the different pieces together easily without having to glue everything permanently. Later you can then reconfigure things with ease.

In this version, I switched back to a mini breadboard, so that I can add things to the circuit more easily. The last version was all soldered together, which was great fun, but really not practical in the long run when you are still trying things out. For the motors, I bought some servos, which are normally intended for controlled angular movement, but I removed those circuits and restrictions and now use them as plain DC motors with gears. I liked this solution because the servos were relatively cheap and provide good power. The big wheels I found at a model shop around the corner. They are pretty much directly attached to the servos (the servos came with small plastic discs that I could glue to the wheels, with a CD providing a surface between the wheel and the servo disc attachment). This is also really nice because it was quite difficult to find a simple gear system for connecting a DC motor to wheels. Most people buy a remote-controlled toy and modify that, because then all the parts are pretty much there; however, you are then not able to build your own design.

Lastly, I now have the new Raspberry Pi camera. It's great for use with raspivid and streaming: you can get great video and choose your compression (a concern when you only have 440 KB/s of DSL upload). That's all stuck or bound together as shown in the picture.

Speed control and voltage read back:

I was having some issues with capturing and analysing the video while moving. The issue is that with movement it takes a while for the stream to settle; otherwise there is just too much noise or too many blurred images (especially in low light) to make reliable decisions. One solution is to make the movement less jerky, as up until now I was just turning the motor on and off (with an input voltage of 4.8 V). I did reduce the input to just 3.6 V, which helped, but still was not good enough. So I changed the code to send a speed value to the ATmega with every movement command. The ATmega then uses this value to control how long the enable pin of the H-bridge (L293) is switched on, so that it is only on for a certain percentage of the time. This is basically software PWM (pulse-width modulation): the ATmega counts loop cycles and switches the enable pin on for the number of loop cycles sent from the Python script running on the Pi. I could also use the ATmega's hardware PWM pins, but those conflict with the SPI pins (the MOSI and SS pins are 2 of the 3 PWM pins).
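
A stripped-down version of that loop-cycle counting could look like the sketch below. The pin number, the 100-cycle period and the variable names are my own illustration, not the actual code in Blink.ino:

// Illustrative software PWM by counting loop cycles, as described above.
// The enable pin number, the 100-cycle period and the delay are assumptions,
// not the actual Blink.ino code from the repository.
const int enablePin = 7;       // L293 enable pin (hypothetical pin number)
const int pwmPeriod = 100;     // loop cycles per PWM period

int speedValue = 60;           // duty cycle in loop cycles (0..pwmPeriod),
                               // in practice updated from the SPI command sent by the Pi
int cycleCount = 0;

void setup() {
    pinMode(enablePin, OUTPUT);
}

void loop() {
    // Hold the H-bridge enable pin high only for the first speedValue cycles of each period
    digitalWrite(enablePin, cycleCount < speedValue ? HIGH : LOW);

    cycleCount++;
    if (cycleCount >= pwmPeriod)
        cycleCount = 0;

    delayMicroseconds(100);    // 100 cycles x 100 us gives roughly a 100 Hz PWM frequency
}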

I have updated the source, https://github.com/arcanon/raspbot/blob/master/Blink.ino, and the Python script, https://github.com/arcanon/raspbot/blob/master/ardspi.py. The SPI read-back is done using the quick2wire duplex method. Note that each byte from the slave is read back at the exact same time as a byte is written by the master, which means the slave needs to pre-prepare the data to be sent, or you discard the first byte; I have just done the latter for now. I send back the 10-bit conversion of the voltage measured by the Arduino on a pin which is connected to the positive battery supply of the motors (3.6 V) via a 10K resistor. The same pin is connected via another 10K resistor to ground, so the voltage measured is 1.8 V, halved by the divider as expected. I have not measured how this varies over time.
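
For illustration, the divider arithmetic on the Arduino side works out like the sketch below. The analog pin and the 5 V ADC reference are assumptions about the setup and are not copied from Blink.ino:

// Illustrative only: read the divided battery voltage and scale it back up.
// The analog pin and the 5 V ADC reference are assumptions, not copied from Blink.ino.
const int batteryPin = A0;         // midpoint of the 10K/10K divider
const float adcReference = 5.0;    // default ATmega ADC reference voltage
const float dividerRatio = 2.0;    // two equal 10K resistors halve the voltage

float readBatteryVoltage() {
    // With a 3.6 V pack the pin sees about 1.8 V, i.e. a raw reading near 368;
    // it is this raw 10-bit value that gets sent back to the Pi over SPI.
    int raw = analogRead(batteryPin);               // 10-bit value, 0..1023
    float pinVoltage = raw * adcReference / 1023.0;
    return pinVoltage * dividerRatio;
}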

The Python script also contains some networking code, which I will write about in another post. That code allows the Pi to talk to a server which is capturing the video stream (from raspivid) and doing various kinds of video analysis. I can then make use of a desktop GPU to improve the speed of the analysis for object detection and such.