Showing posts from November, 2015

How to calculate a robot’s forward kinematics in 5 easy steps

Calculating the forward kinematics is often a vital first step to using a new robot in research. While there are some good tutorials available online, up until now there hasn’t been a simple step-by-step guide for calculating forward kinematics. In this post, we provide a simple guide along with some tips on calculating the kinematics of any robotic manipulator. Calculating kinematics is a cornerstone skill for robotics engineers. Kinematics can sometimes be a pain; however, being told to “go and calculate the forward kinematics” is almost robotics research shorthand for “go and get familiar with this robot”. It’s the vital first step when using any new robot in research, particularly for manipulators. Even though I had learned the theory of kinematics at university, it wasn’t until I had calculated kinematic solutions for a few real research robots that the whole process started to feel intuitive. Even then, because I was not calculating kinematics every day, I had to g…
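As a taste of what the full post covers, forward kinematics boils down to chaining one homogeneous transform per joint from the base frame to the end-effector. The sketch below uses the standard Denavit-Hartenberg convention; the two-link planar arm at the end (link lengths 0.5 m and 0.3 m) is a hypothetical example, not any particular research robot.

```python
import numpy as np

def dh_transform(theta, d, a, alpha):
    """Homogeneous transform for one joint, standard DH parameters."""
    ct, st = np.cos(theta), np.sin(theta)
    ca, sa = np.cos(alpha), np.sin(alpha)
    return np.array([
        [ct, -st * ca,  st * sa, a * ct],
        [st,  ct * ca, -ct * sa, a * st],
        [0.0,      sa,       ca,      d],
        [0.0,     0.0,      0.0,    1.0],
    ])

def forward_kinematics(joint_angles, dh_params):
    """Multiply the per-joint transforms: base frame -> end-effector frame."""
    T = np.eye(4)
    for theta, (d, a, alpha) in zip(joint_angles, dh_params):
        T = T @ dh_transform(theta, d, a, alpha)
    return T

# Hypothetical 2-link planar arm: (d, a, alpha) per link, all rotations in-plane.
dh = [(0.0, 0.5, 0.0), (0.0, 0.3, 0.0)]
T = forward_kinematics([np.pi / 2, 0.0], dh)
print(T[:3, 3])  # end-effector position in the base frame
```

With the first joint at 90 degrees and the second at zero, both links point along the base y-axis, so the end-effector sits at roughly (0, 0.8, 0). The same loop scales to any serial manipulator once its DH table is written down.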

Robots at #ERW2015: From imagination to market

Bridging the gap between cutting-edge research in academia and the vibrant robotics startup ecosystem is no easy task. This Wednesday in the UK city of Bristol, a free public event titled “From Imagination to Market” — the centrepiece of European Robotics Week 2015 — took on that challenge by bringing together leading innovators, researchers, startups and strategists. Below are the key moments and insights from the event. Organised by euRobotics AISBL (which represents robotics stakeholders in Europe) and the Bristol Robotics Laboratory (BRL — one of the largest facilities of its kind in Europe), this event was packed with insight, stories and advice. The day kicked off with an introduction to robots in science fiction by Michael Szollosy and a look at the field of RoboLaw by Andrea Bertolini, before taking a deep dive into the BRL’s state-of-the-art research on swarms, human-robot interaction and soft robotics. Then we heard from up-and-coming startups Open Bionics and Reach Rob…


Mining for resources hidden under the surface of the earth has never been a job devoid of hazards. You can start with the fact that it’s under the earth: there’s no natural light, and the spaces can get tiny, so it’s clearly not a job for even the mildly claustrophobic. Then there’s the danger of cave-ins, which still happen despite all the modern marvels we have access to thanks to advances in structural engineering. When it comes to coal mining, however, the danger is several times greater due to the added danger of highly flammable gases in certain areas. Other than the Japanese using them for what basically amounts to toys — some amusing (robotic pets), some disconcerting (robots that look like creepy versions of children or grown women) — the use of robotic technology has almost overwhelmingly been either to go where man hasn’t or cannot go, or in other cases to augment human ability to do what was hitherto very difficult. We have robotic rover vehicles landing on…

Bosch's Giant Robot Can Punch Weeds to Death

At IROS last month, researchers from a Bosch startup called Deepfield Robotics presented a paper on “Vision-Based High-Speed Manipulation for Robotic Ultra-Precise Weed Control,” which has like four distinct exciting-sounding phrases in it. We wanted to write about it immediately, but Deepfield asked us to hold off a bit until their fancy new website went live, which it now has. This means that we can show you video of their enormous agricultural robot that can autonomously detect and physically obliterate individual weeds in a tenth of a second. The stamping tool is 1 centimeter wide, and it drives weeds about 3 cm into the soil. It’s designed to detect (through leaf shape) and destroy small weeds that have just sprouted, although for larger weeds, it can hammer them multiple times in a row with a cycle time of under 100 ms. Testing on a real carrot crop, which has carrots spaced about 2 cm apart and an average of 20 weeds per meter growing very close to the carrots themselv…

HoloLens needs more work, but using it with 'Minecraft' is so damn cool

Before going any further, let's be clear: HoloLens is unfinished. You'll see a good deal of enthusiasm below, but the head-mounted display (HMD) in its current form is in no way ready for mass consumption. It's the eye-opening possibilities these early HoloLens experiences deliver that drive much of the excitement. What is this thing? HoloLens is an augmented reality visor. Unlike virtual reality, which plunges the player into a private, immersive experience with the help of a completely enclosed headset, AR devices provide what amounts to a personal heads-up display. Unlike VR, you can see the world in front of you. An AR headset's lenses are best thought of as miniature see-through displays that throw computer-generated imagery on top of what your eyes would normally see. In its present state, HoloLens's biggest limitation is its extremely narrow field of view. Imagine a small movie screen positioned inches away from your face; any A…