The idea behind Ubiquitous Computing is that the third wave of computing is coming (or is already here). The first wave was Mainframe Computing: computers were large machines, found in small quantities at large companies, with several users each. The second wave is Personal Computing: every home, or even every person, has a "desktop" computer. The third wave is Ubiquitous Computing: there are (many) more computers than users. The Mobile Computing we see today, where each user has several devices (computer, tablet, phone, car system...), is an intermediate stage between the second and third waves. "Real" Ubiquitous Computing can encompass a lot of things: sensor networks, smart dust, robotics, augmented humans... Basically, it is the vision of a future where computation is everywhere.

Regarding programming, the shift is this: in the earlier waves, an application ran on *one* computer. In Ubiquitous Computing that no longer holds, since there may well be fewer applications than devices; an application runs on *several* computers. This is why the idea of the Cloud is (somewhat) related to Ubiquitous Computing.

So learning how to program for Ubiquitous Computing is first and foremost learning how to write distributed algorithms. I have no idea what your current level is, but if you are at least an intermediate programmer, a good start is the problem of consensus and the Raft algorithm. You would also benefit from learning Erlang and/or playing with messaging systems like 0MQ / nanomsg (see the sketches at the end of this answer).

Other things will matter too. You cannot have hundreds of devices per user that each draw 50 W, so some of the programming will have to be energy-conscious. That means it will be very low-level; odds are it will actually be closer to circuit design, so I'd say learning about things like FPGAs and VHDL could be a good idea. Playing with Arduinos as well.
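To give a flavor of what Raft is about: its core mechanism is leader election via randomized timeouts plus majority voting per term. Here is a minimal single-round sketch, simulated in-process; real Raft adds RPCs, log replication, and persistence, so the class and method names below are illustrative assumptions, not the actual protocol.

```python
# Toy single-round Raft-style leader election, simulated in-process.
# Real Raft runs this over RPC with log checks and retries; this
# sketch only shows the term/vote/majority mechanics.
import random

class Node:
    def __init__(self, node_id):
        self.id = node_id
        self.current_term = 0
        self.voted_for = None        # who we voted for in current_term

    def request_vote(self, term, candidate_id):
        # Grant at most one vote per term (Raft's safety rule).
        if term > self.current_term:
            self.current_term = term
            self.voted_for = None
        if term == self.current_term and self.voted_for in (None, candidate_id):
            self.voted_for = candidate_id
            return True
        return False

def run_election(nodes):
    # The node whose randomized timeout fires first becomes candidate.
    candidate = min(nodes, key=lambda n: random.uniform(150, 300))
    candidate.current_term += 1
    candidate.voted_for = candidate.id           # votes for itself
    votes = 1 + sum(
        peer.request_vote(candidate.current_term, candidate.id)
        for peer in nodes if peer is not candidate
    )
    return candidate.id if votes > len(nodes) // 2 else None

nodes = [Node(i) for i in range(5)]
print("leader:", run_election(nodes))
```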
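And to get a feel for message-based distributed programming, here is a minimal request/reply pair using pyzmq, the Python binding for 0MQ. The port number and message contents are arbitrary choices for the example.

```python
# Minimal 0MQ request/reply pair using pyzmq (pip install pyzmq).
# Run one process as "server" and another without arguments.
import sys
import zmq

def reply():
    ctx = zmq.Context()
    sock = ctx.socket(zmq.REP)
    sock.bind("tcp://*:5555")      # the "server" side of the pair
    while True:
        msg = sock.recv_string()   # blocks until a request arrives
        sock.send_string(f"echo: {msg}")

def request():
    ctx = zmq.Context()
    sock = ctx.socket(zmq.REQ)
    sock.connect("tcp://localhost:5555")
    sock.send_string("hello")
    print(sock.recv_string())      # prints "echo: hello"

if __name__ == "__main__":
    reply() if sys.argv[1:] == ["server"] else request()
```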
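On the energy-conscious side, the usual pattern is duty cycling: wake up, take a reading, transmit, then put the whole chip into deep sleep. To keep everything in one language, here is a MicroPython-flavored sketch for an ESP32-class board rather than Arduino C++; `read_sensor`, `send`, pin 34, and the sleep interval are all assumptions for illustration.

```python
# Duty-cycling sketch in MicroPython on an ESP32-class board: wake,
# sample, send, deep-sleep. Waking from deep sleep reboots the board
# and reruns this script, so the "loop" is implicit.
import machine

SLEEP_MS = 60000  # sleep 60 s between samples (arbitrary choice)

def read_sensor():
    adc = machine.ADC(machine.Pin(34))   # pin 34 is just an example
    return adc.read()

def send(value):
    print("sample:", value)              # stand-in for a radio send

send(read_sensor())
machine.deepsleep(SLEEP_MS)              # cuts power until the timer fires
```

The interesting design point is that almost all the energy savings come from how long you can afford to stay asleep, not from optimizing the awake code; that trade-off is where the "circuit design" mindset starts to matter.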