Subscribe: Chris Anderson's Posts - DIY Drones
http://diydrones.com/profiles/blog/feed?user=zlitezlite&xn_auth=no
Language: English





Updated: 2017-11-20T06:08:22Z

 



Great YouTube channel on using Dronekit, APM

2017-11-19T23:02:04.000Z

(Video: https://www.youtube.com/embed/iL5DIqL9qdE)

Tiziano Fiorenzani, an Italian engineer now working in the US, has one of the best YouTube channels on using open source drone software, especially APM, DroneKit, and Python. Above is just one example, on Drone Delivery with Python:

We are going to write a script that connects with the vehicle and waits for the operator to upload a valid mission. Then the script adds our current location as the final waypoint, and the vehicle is commanded to arm and take off. The vehicle is then set to Auto, and once the final waypoint is reached, the script deletes the mission and sets the vehicle to Return to Launch mode. At the end, the script resets its status and is ready for new adventures!
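The flow Fiorenzani describes maps naturally onto a small state machine. Below is a minimal sketch of that logic in plain Python, with a hypothetical `MissionController` class standing in for the actual DroneKit `Vehicle` calls (names, states, and coordinates are illustrative, not the DroneKit API):

```python
# Sketch of the mission flow described above. Plain Python stands in for
# DroneKit calls; class and state names are illustrative, not the real API.

class MissionController:
    def __init__(self):
        self.reset()

    def reset(self):
        self.state = "WAIT_MISSION"   # idle until the operator uploads a mission
        self.mission = []
        self.mode = "STABILIZE"
        self.armed = False

    def upload_mission(self, waypoints):
        if self.state != "WAIT_MISSION" or not waypoints:
            return False              # reject invalid or ill-timed uploads
        self.mission = list(waypoints)
        self.state = "MISSION_READY"
        return True

    def start(self, current_location):
        # Add our current location as the final waypoint, then arm and take off.
        assert self.state == "MISSION_READY"
        self.mission.append(current_location)
        self.armed = True
        self.mode = "AUTO"            # vehicle flies the mission autonomously
        self.state = "FLYING"

    def on_waypoint_reached(self, index):
        # Once the final waypoint is reached: delete the mission and RTL.
        if self.state == "FLYING" and index == len(self.mission) - 1:
            self.mission = []
            self.mode = "RTL"
            self.state = "RETURNING"

    def on_landed(self):
        # Reset and wait for new adventures.
        if self.state == "RETURNING":
            self.reset()

ctl = MissionController()
ctl.upload_mission([(35.0, -120.0), (35.1, -120.1)])
ctl.start(current_location=(35.05, -120.05))
ctl.on_waypoint_reached(2)            # index 2 is the appended final waypoint
print(ctl.mode)                       # RTL
ctl.on_landed()
print(ctl.state)                      # WAIT_MISSION
```

In a real DroneKit script the mode changes and waypoint callbacks would go through the connected vehicle object; the state machine above is just the skeleton the video builds on.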

See the rest of his videos on his YouTube channel here.




DIY Drones now at 86,000 users

2017-10-20T20:20:54.000Z


(image)

It's customary and traditional that we celebrate the addition of every 1,000 new members here and share the traffic stats. We've now passed 86,000 members! We're also more than ten years old!

Rather than simply give the usual monthly traffic snapshot, I thought I'd give the data for the whole decade, which tells quite a story. 

  • First, some amazing totals:
    • More than 20 million users and 118 million pageviews over the decade. 
    • 13,400 blog posts
    • More than 60,000 discussion threads
    • Nearly a million comments
  • Second, the ups and downs of this industry. Over the ten years, we've gone from one of the few drone communities around to today, when there are hundreds of sites, most of them commercial, and drone users and developers are scattered amongst them. In the early 2010s, DIY Drones was in the top three results on Google for "drones". Now there are pages and pages of commercial sites before it. That's a natural thing and demonstrates classic maturing of an industry. The amateurs have given way to the pros.
  • Third, the related rise and fall of "DIY" in the drone industry. With the triumph of DJI and its Phantom (and now Mavic and Spark) lines, it's no longer necessary to build your own drone. This is a good thing (the same happened with PCs and all sorts of electronics before it), and many people still choose to do so anyway for fun (as they still do with PCs), but it's clearly gone back to a niche activity or one for developers, much as it was in the early days. 

Today, we're still a big community with healthy traffic (about 10,000 visitors and 15,000 page views a day). And we'll continue just as we are for many years to come. We won't be the biggest site in this space, but we'll continue to be one of the most interesting, and a friendly, high-quality place to talk about ideas and projects that extend the potential of drones to change the world. And have fun doing it!




Blimpduino flies again!

2017-10-03T22:37:45.000Z

Ten years after this site started with its first project, a robotic blimp called Blimpduino, two long-time friends here, Jordi Munoz and Jose Julio, are relaunching it in a new, improved form. It's called, unsurprisingly, Blimpduino 2.0, and you can see the progress here. The new one is based on a fly-by-wire WiFi system (smartphone-based) and supports computer vision with the OpenMV camera. It's what we had in mind a decade ago; it just took years for the technology to catch up. It looks like it will be available in a couple of months. Can't wait! [...]



"MVP" computer vision rover with OpenMV for less than $90

2017-10-02T04:30:00.000Z


(image)

This is the cheapest good computer vision autonomous car you can make — less than $85! It uses the fantastic OpenMV camera, with its easy-to-use software and IDE, as well as a low-cost chassis that is fast enough for student use. It can follow lanes of any color, objects, faces and even other cars. It's as close to a self-driving Tesla as you’re going to get for less than $100 ;-)

It’s perfect for student competitions, where a number of cars can be built and raced against each other in an afternoon.

(Video: https://www.youtube.com/embed/3G4eN2A52ic)

Instructions and code are here.
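Under the hood, this style of lane follower usually reduces to finding the lane-colored blob in each frame (which the OpenMV camera does on-board with its blob detection) and steering proportionally toward the blob's centroid. Here is a minimal sketch of just that control law in plain Python, with the vision step abstracted to a centroid x-coordinate; the function name, frame width, and gain are illustrative, not from the project's code:

```python
def steering_from_centroid(cx, frame_width=160, gain=1.0):
    """Map a detected lane-blob centroid x-position to a steering command.

    Returns a value in [-1.0, 1.0]: negative steers left, positive steers
    right. On a real rover this would be rescaled to the steering servo's
    pulse-width range.
    """
    center = frame_width / 2.0
    # Normalized horizontal error of the lane centroid from image center.
    error = (cx - center) / center
    # Proportional steering, clamped to the valid command range.
    return max(-1.0, min(1.0, gain * error))

# Lane dead ahead -> no correction; lane at the right edge -> full right turn.
print(steering_from_centroid(80))    # 0.0
print(steering_from_centroid(160))   # 1.0
print(steering_from_centroid(40))    # -0.5
```

A gain above 1.0 makes the car turn harder for the same error, which helps at higher speeds but can cause oscillation, exactly the kind of tuning students end up racing over.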




Eagles vs Drones (spoiler: Eagles win)

2017-10-01T00:08:48.000Z

From the Wall Street Journal:

SYDNEY— Daniel Parfitt thought he’d found the perfect drone for a two-day mapping job in a remote patch of the Australian Outback. The roughly $80,000 machine had a wingspan of 7 feet and resembled a stealth bomber. There was just one problem. His machine raised the hackles of one prominent local resident: a wedge-tailed eagle.

Swooping down from above, the eagle used its talons to punch a hole in the carbon fiber and Kevlar fuselage of Mr. Parfitt’s drone, which lost control and plummeted to the ground. “I had 15 minutes to go on my last flight on my last day, and one of these wedge-tailed eagles just dive-bombed the drone and punched it out of the sky,” said Mr. Parfitt, who believed the drone was too big for a bird to damage. “It ended up being a pile of splinters.”

Weighing up to nine pounds with a wingspan that can approach eight feet, the wedge-tailed eagle is Australia’s largest bird of prey. Once vilified for killing sheep and targeted by bounty hunters, it is now legally protected. Though a subspecies is still endangered in Tasmania, it is again dominating the skies across much of the continent. These highly territorial raptors, which eat kangaroos, have no interest in yielding their apex-predator status to the increasing number of drones flying around the bush. They’ve even been known to harass the occasional human in a hang glider.

(Photo: a wedge-tailed eagle, taken by an Australian UAV drone. PHOTO: AUSTRALIAN UAV)

Birds all over the world have attacked drones, but the wedge-tailed eagle is particularly eager to engage in dogfights, operators say. Some try to evade these avian enemies by sending their drones into loops or steep climbs, or just mashing the throttle to outrun them. A long-term solution remains up in the air.

Camouflage techniques, like putting fake eyes on the drones, don’t appear to be fully effective, and some pilots have even considered arming drones with pepper spray or noise devices to ward off eagles. They are the “ultimate angry birds,” said James Rennie, who started a drone-mapping and inspection business in Melbourne called Australian UAV. He figures that 20% of drone flights in rural areas get attacked by the eagles. On one occasion, he was forced to evade nine birds all gunning for his machine.

The birds are considered bigger bullies than their more-docile relatives, such as the bald and golden eagles in the U.S. Wedge-tailed eagles are the undisputed alpha birds in parts of Australia’s interior, but it’s not entirely clear why they’re so unusually aggressive towards drones. Scientists say they go after drones probably because they view them as potential prey or a new competitor. “They’re really the kings of the air in Australia,” said Todd Katzner, a biologist and eagle expert at the U.S. Geological Survey in Boise, Idaho. “There’s nothing out there that can compete with them.”

(Photo: Nick Baranov holds a drone camouflaged with ‘eagle-eyes.’ PHOTO: AUSTRALIAN UAV)

The problem is growing more acute as Australia makes a push to become a hot spot for drones. One state, Queensland, recently hosted the “World of Drones Congress” and last year gave about $780,000 to Boeing Co. for drone testing. Amazon.com is expanding in Australia and could try using drones for deliveries, and the machines are increasingly favored by big landowners such as miners and cattle ranchers.

The eagles will often attack in male-female pairs, and they aren’t always deterred if their first foray fails. Sometimes they will come from behind, attack in tandem from above, or even stagger their assault. A drone operator may evade one diving eagle with an upward climb, but the second eagle can then snatch it. [...]



Our DIY Robocars sister community in the news

2017-09-26T14:02:20.000Z

(Video: https://www.youtube.com/embed/i-x_-L4uBpc)

Jalopnik covers our DIY Robocars sister community. If you're in the Bay Area or one of the other half-dozen areas with these races, join us! We'll also be competing in the Sparkfun AVC in Denver in October.




Dronecode announces new Dronecore SDK, shipping with new Yuneec H520

2017-09-10T20:12:10.000Z


(image)

At this week's Interdrone conference, Yuneec and Dronecode announced the new DroneCore SDK, which is now shipping on the new Dronecode-based Yuneec H520 commercial hexacopter. The above slide shows how the architecture works, but basically DroneCore replaces the old DroneKit SDK and provides an easy-to-use mobile (Android and iOS) and onboard (C++ and Python) interface to Dronecode/PX4-based vehicles. Of which there are many!

(image)

The library provides a simple core API for managing one or more vehicles, providing programmatic access to vehicle information and telemetry, and control over missions, movement and other operations.

Developers can extend the library using plugins in order to add any other required MAVLink API (for example, to integrate PX4 with custom cameras, gimbals, or other hardware over MAVLink).

DroneCore can run on a vehicle-based companion computer or on a ground-based GCS or mobile device. These devices have significantly more processing power than an ordinary flight controller, enabling tasks like computer vision, obstacle avoidance, and route planning.
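That core-plus-plugins structure can be illustrated with a toy registry. To be clear, this is not the DroneCore API, just a plain-Python sketch of the architecture described above: a core object manages discovered vehicles, and plugins bolt on extra MAVLink-style capabilities without touching the core.

```python
# Toy illustration of a core-plus-plugins SDK architecture (not DroneCore's
# actual API): the core manages vehicles; plugins add named capabilities.

class Core:
    def __init__(self):
        self.vehicles = {}        # system id -> per-vehicle state
        self.plugins = {}         # plugin name -> handler function

    def discover(self, system_id):
        # In a real SDK this would happen when a vehicle heartbeat arrives.
        self.vehicles[system_id] = {"telemetry": {}}

    def register_plugin(self, name, handler):
        # A plugin adds a capability, e.g. custom camera or gimbal control
        # carried over MAVLink, without modifying the core.
        self.plugins[name] = handler

    def call(self, name, system_id, **kwargs):
        return self.plugins[name](self.vehicles[system_id], **kwargs)

def gimbal_plugin(vehicle, pitch, yaw):
    # Hypothetical extension: point the gimbal and record it in telemetry.
    vehicle["telemetry"]["gimbal"] = (pitch, yaw)
    return "GIMBAL_SET"

core = Core()
core.discover(system_id=1)
core.register_plugin("gimbal", gimbal_plugin)
print(core.call("gimbal", 1, pitch=-30, yaw=10))   # GIMBAL_SET
```

The point of the design is that the core stays small while any vendor-specific MAVLink extension lives in its own plugin, which matches the description above.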

The full reference is here.




Boeing's Insitu uses swarms of Solos for autonomous mapping

2017-08-21T16:30:00.000Z

(Video: https://www.youtube.com/embed/-pwDkxeWq14)

In this episode, the Roswell Flight Test Crew speaks with John Leipper, the Solutions Architecture Manager for drone manufacturer Insitu. At the Future Farm Drone Rodeo in Pendleton, Oregon, Insitu conducted a drone swarm demonstration using three 3DR Solos – all controlled by a single pilot using a computer. Of course, to stay in compliance with FAA regulations, an individual pilot for each aircraft was on standby should immediate human intervention be required. The long-term goal is to make drones more efficient through automation, requiring less direct human input to gather data more quickly than would be possible with a single drone. Such a control system would also allow the drones to be operated remotely via the Internet or other networks.






Introducing OpenSolo!

2017-08-03T17:00:00.000Z

Big news! Reposting from the 3DR blog. Also see the ArduPilot Team announcement here.

When we launched Solo back in 2015, one of its selling points was that it was based on the open source ArduPilot software, the project that Jordi Munoz and I launched as a side project way back in 2007 and that then grew beyond our imagination in the able hands of the community. The point of Solo was to package up this open stack in a polished, easy-to-use consumer product (like the DJI Phantom), treating the ArduPilot stack as an "open core" and extending its functionality with proprietary features, much as companies do with Linux-based devices. This worked very well as a product (Solo had some really innovative features, some of which are still unequaled) but less well as a business (we couldn't make it cheaply enough to keep up with the rapid price declines in the consumer market, so we stopped making them at the end of 2015).

Now, two years later, 3DR has shifted its focus to the commercial market that exploded after the FAA launched its Part 107 commercial operator licensing program last year. But there are lots of Solos still out there, with great untapped potential — it's just not our core business anymore. So what to do? Open source the rest of it!

We've heard loud and clear that the community wants a tried-and-true ArduPilot platform that can be extended without limit. The ArduPilot team has already embraced Solo and ported the latest flight code to it. But the custom 3DR WiFi control, telemetry, and video streaming technology, the "Artoo" controller, and the "Shot Manager" mission control stack that runs on the onboard Linux processor were not open source, so the full potential of the drone remained locked. No more.

I'm delighted to announce that we're now open sourcing almost all of the remaining code, including the SoloLink wireless stack; ShotManager, the high-level onboard mission scripting layer that gave Solo all of its "smart shots"; and a range of other packages, including the code for the controller and the build tools. The code has now been released in a new OpenSolo organization on GitHub, licensed under the permissive Apache 2.0 license.

More details about what's been released here:
  • solo-builder – scripts for configuring a virtual machine to build the Solo software
  • meta-3dr – the build recipes that assemble the complete Linux system for the Solo and Controller i.MX6 processors
  • shotmanager – implementation of Solo's Smart Shots
  • sololink – 3DR software that runs on the i.MX6 processors, implementing things like video streaming, control, telemetry, pairing, and logging
  • artoo – firmware for the STM32 microcontroller in the controller, responsible for the inputs and screen
  • solo-gimbal (coming soon) – firmware for the microcontrollers in the Solo Gimbal

[...]



Dronecode updates for July: new QGroundControl, FPGA autopilot, PX4 1.6

2017-07-23T01:30:00.000Z

(Video: https://player.vimeo.com/video/226583727)

Lots of news and updates from the Dronecode team this month:

1) The Aerotenna OcPoC autopilot (video above) now supports the Dronecode/PX4 software stack!
  • FPGA and dual-core ARM processors in OcPoC allow for real-time signal processing and for executing complicated algorithms, enabling exciting new possibilities for artificial intelligence, deep learning, and a truly autonomous and intelligent UAV
  • With more than 30 programmable I/Os supporting most standard interfaces, OcPoC is incredibly flexible, allowing free rein for your creativity
  • OcPoC features industrial-grade redundancy, ensuring you can always count on key systems such as GPS, IMU, and more
  • Flawless integration with Aerotenna microwave radar sensors, including the uLanding radar altimeter and uSharp collision-avoidance sensor

2) QGroundControl 3.2 is out! Many improvements and new features:

Settings
  • File Save path – specify a save path for all files used by QGC
  • Telemetry log auto-save – telemetry logs are now automatically saved without prompting
  • AutoLoad Plans – used to automatically load a Plan onto a vehicle when it first connects
  • RTK GPS – specify the Survey-In accuracy and minimum observation duration

Setup (ArduPilot only)
  • Pre-Flight Barometer and Airspeed calibration – now supported
  • Copy RC Trims – now supported

Plan View
  • Plan files – missions are now saved as .plan files, which include the mission, geo-fence, and rally points
  • Plan Toolbar – new toolbar which shows you mission statistics and an Upload button
  • Mission Start – allows you to specify values such as flight speed and camera settings to start the mission with
  • New Waypoint features – adjust heading and flight speed for each waypoint, as well as camera settings
  • Visual Gimbal direction – gimbal direction is shown on waypoint indicators
  • Pattern tool – allows you to add complex patterns to a mission: Fixed Wing Landing (new), Survey (many new features)
  • Fixed Wing Landing Pattern – adds a landing pattern for fixed wings to your mission
  • Survey – new features: Take Images in Turnarounds (specify whether to take images through the entire survey or just within each transect segment), Hover and Capture (stop the vehicle at each image location and take a photo), Refly at 90-degree offset (add an additional pattern at a 90-degree offset to the original for better image coverage), Entry location (specify the entry point for the survey), and Polygon editing (a simple on-screen mechanism to drag, resize, and add/remove points, with much better touch support)

Fly View
  • Arm/Disarm – available from the toolbar
  • Guided Actions – new action toolbar on the left. Supports: Takeoff, Land, RTL, Pause, Start Mission, Resume Mission (after battery change), Change Altitude, Land Abort, Set Waypoint, Goto Location
  • Remove mission after vehicle lands – prompt to remove the mission from the vehicle after landing
  • Flight Time – flight time is shown in the instrument panel
  • Multi-Vehicle View – better control of multiple vehicles

Analyze View (new)
  • Log Download – moved to the Analyze view from the menu
  • Mavlink Console – NSH shell access

Support for third-party customized QGroundControl: standard QGC supports multiple firmware types and multiple vehicle types. There is now support in QGC which allows a thi[...]



DIY Drones at 85,000 members -- a look back at an extraordinary decade

2017-07-07T01:00:00.000Z


(image)

It's customary and traditional that we celebrate the addition of every 1,000 new members here and share the traffic stats. We've now passed 85,000 members! We're also more than ten years old!

Rather than simply give the usual monthly traffic snapshot, I thought I'd give the data for the whole decade, which tells quite a story. 

  • First, some amazing totals:
    • More than 20 million users and 117 million pageviews over the decade. 
    • 13,400 blog posts
    • More than 60,000 discussion threads
    • Nearly a million comments
  • Second, the ups and downs of this industry. Over the ten years, we've gone from one of the few drone communities around to today, when there are hundreds of sites, most of them commercial, and drone users and developers are scattered amongst them. In the early 2010s, DIY Drones was in the top three results on Google for "drones". Now there are pages and pages of commercial sites before it. That's a natural thing and demonstrates classic maturing of an industry. The amateurs have given way to the pros.
  • Third, the related rise and fall of "DIY" in the drone industry. With the triumph of DJI and its Phantom (and now Mavic and Spark) lines, it's no longer necessary to build your own drone. This is a good thing (the same happened with PCs and all sorts of electronics before it), and many people still choose to do so anyway for fun (as they still do with PCs), but it's clearly gone back to a niche activity or one for developers, much as it was in the early days. 

Today, we're still a big community with healthy traffic (about 20,000 visitors and 35,000 page views a day). And we'll continue just as we are for many years to come. We won't be the biggest site in this space, but we'll continue to be one of the most interesting, and a friendly, high-quality place to talk about ideas and projects that extend the potential of drones to change the world. And have fun doing it!




ArduPilot, PX4 dominate AUVSI drone competition

2017-06-26T17:00:00.000Z


(image)

When we got started ten years ago, the annual AUVSI student drone competition was dominated by commercial autopilots, such as Piccolo. Now it's almost entirely open source autopilots, led by ArduPilot (14 of the top 20) and Dronecode/PX4 (3 of the top 20). I'm super proud of this, having co-founded ArduPilot and now leading Dronecode. Only one commercial autopilot remains in the top 20 -- next year they will be gone entirely!

From sUAS News

(image)




How do modern open source autopilots compare to aerospace-grade IMUs?

2017-06-20T17:21:08.000Z

I noticed that Digikey is now selling Honeywell's newest aerospace-grade IMUs, which cost $1,328 each (note that's just for the IMU; it's not a full autopilot). How do the specs of these aerospace IMUs compare to those we use here? Are they worth the extra money?

In terms of overall approach, the Honeywell IMU seems very similar to modern autopilots such as Pixhawk 2.x and 3.3: they both have MEMS sensors with internal environmental isolation and temperature compensation. As for the sensors themselves, I'm no expert on specs, so I'll just post the basics here, comparing the Honeywell sensor to the Pixhawk 3.

On the face of it, the Invensense and ST sensors in the Pixhawk 3 appear at least as good, if not better. But I imagine that there are other factors that may be more important, such as gyro drift and vibration filtering. The Honeywell drift specs are shown here:

Meanwhile, the Invensense ICM-20602 sensor in the Pixhawk 3 gives its drift in different units: ±4 mdps/√Hz. I really don't know how to compare those.

Finally, I'm sure that a lot of the performance depends on the software running on the Pixhawk boards, be it PX4 or APM, both of which use GPS to augment the raw IMU data to compensate for drift, along with a lot of other smart filtering.

So for those IMU experts out there: how do you think these two approaches compare? Are aerospace-grade IMUs worth the extra money? [...]
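On the units question, one standard bridge between the two conventions: a gyro's rate-noise density (here given in mdps/√Hz) converts to the angle-random-walk figure that aerospace datasheets typically quote in deg/√hr by multiplying by √3600 s/hr, which is exactly 60. (This only compares like with like if the Honeywell number is indeed an angle random walk rather than, say, bias instability, so treat it as a first-order check.) A quick sketch for the ICM-20602's ±4 mdps/√Hz:

```python
import math

def noise_density_to_arw(nd_mdps_per_rthz):
    """Convert gyro rate-noise density (mdps/√Hz) to angle random walk (deg/√hr).

    ARW[deg/√hr] = ND[deg/s/√Hz] * sqrt(3600 s/hr) = ND * 60.
    """
    nd_dps = nd_mdps_per_rthz / 1000.0      # mdps -> deg/s per √Hz
    return nd_dps * math.sqrt(3600.0)       # per-√s -> per-√hr

# ICM-20602 datasheet figure of ±4 mdps/√Hz:
print(noise_density_to_arw(4.0))            # ≈0.24 deg/√hr
```

So the consumer part's random-walk figure lands in the neighborhood of 0.24 deg/√hr, which can then be read directly against an aerospace datasheet that quotes ARW in the same units.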



Intel cancels Edison, Joule boards

2017-06-19T12:50:50.000Z

It was well known that Edison was going to be discontinued this year, but Joule, which was just released, is a surprise. This is bad news for any autopilot board that uses Edison, such as Pixhawk 2.1, which will now have to move to another companion computer. (I'd suggest Raspberry Pi.) From Hackaday:

Sometimes the end of a product's production run is surrounded by publicity, a mix of a party atmosphere celebrating its impact either good or bad, and perhaps a tinge of regret at its passing. Think of the last rear-engined Volkswagens rolling off their South American production lines for an example. Then again, there are the products that die with a whimper, their passing marked only by a barely visible press release in an obscure corner of the Internet. Such as this week's discontinuances from Intel, in a series of PDFs lodged on a document management server announcing the end of their Galileo (PDF), Joule (PDF), and Edison (PDF) lines. The documents in turn set out a timetable for each of the boards; for now they are still available, but the last will have shipped by the end of 2017.

It's important to remember that this does not mark the end of the semiconductor giant's foray into the world of IoT development boards; there is no announcement of the demise of their Curie chip, as found in the Arduino 101. But it does mark an ignominious end to their efforts over the past few years in bringing the full power of their x86 platforms to this particular market; the Curie is an extremely limited device in comparison to those being discontinued. Will the departure of these products affect our community, other than those who have already invested in them?

It's true to say that they haven't made the impression Intel might have hoped; over the years only a sprinkling of projects featuring them have come our way, compared to the flood featuring an Arduino or a Raspberry Pi. They do seem to have found a niche, though, where there is a necessity for raw computing power rather than a simple microcontroller, so perhaps some of the legion of similarly powerful ARM boards will plug that gap. So where did Intel get it wrong? How did what were on the face of it such promising products fizzle out in such a disappointing manner? Was the software support not up to scratch, were they too difficult to code for, or were they simply not competitively priced in a world of dirt-cheap boards from China? [...]



NASA webinar on drone crash-avoidance technology using APM copters

2017-06-16T22:33:33.000Z

(Video: https://www.youtube.com/embed/kM_FFWgPV_Q)

Is it flattering that NASA uses a 3DR Y6 to teach "crash management" techniques? I'm going with yes! Register here

NASA’s Langley Research Center is offering a free informational webinar on its autonomous crash management system for small UAVs, which enables landing a malfunctioning unit at a safe and clear ditch site. The webinar will take place on July 25th at 2 PM (EDT).

The mission of the system, called Safe2Ditch, is emergency management: to get the vehicle safely to the ground in the event of an unexpected critical flight issue, for example a drone delivery flight that loses battery power before reaching its destination.

Safe2Ditch uses intelligent algorithms, knowledge of the local area, and the remaining control authority and battery life to select the safest landing location for a crippled UAV and steer it to the ground. The system helps minimize the risk UAVs pose to people and property. This mission is performed autonomously, without any assistance from a safety pilot or ground station, all while residing on a small onboard processor.
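The selection step described there can be sketched as constrained scoring: from a database of known ditch sites, keep only those reachable on the remaining battery, then pick the one with the best safety rating. The following is a simplified illustration of that idea, not NASA's actual Safe2Ditch algorithm; every name, number, and rating below is made up:

```python
import math

def pick_ditch_site(position, sites, battery_s, speed_mps):
    """Pick a ditch site for a crippled UAV (toy model, not Safe2Ditch).

    sites: list of (name, (x, y), safety) with safety in [0, 1].
    battery_s: estimated seconds of flight remaining.
    """
    max_range = battery_s * speed_mps            # crude reachable distance
    reachable = [
        (name, xy, safety) for name, xy, safety in sites
        if math.dist(position, xy) <= max_range
    ]
    if not reachable:
        return None                              # no safe option in range
    # Prefer the safest site; break ties by closeness to our position.
    best = max(reachable,
               key=lambda s: (s[2], -math.dist(position, s[1])))
    return best[0]

sites = [
    ("empty_field", (300.0, 0.0), 0.9),
    ("parking_lot", (50.0, 50.0), 0.6),
    ("far_meadow", (5000.0, 0.0), 1.0),          # too far on low battery
]
print(pick_ditch_site((0.0, 0.0), sites, battery_s=60, speed_mps=8.0))
# empty_field
```

The real system additionally weighs remaining control authority (a drone with a failed motor cannot fly an arbitrary path), which would shrink the reachable set further than this straight-line range check does.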

During this free webinar, lead inventors Patricia Glaab and Louis Glaab will discuss this technology and its potential uses, followed by an open Q&A session.




Nvidia demos visual navigation on 3DR Iris+

2017-06-13T13:30:00.000Z

(YouTube video: https://www.youtube.com/embed/4_TmPA-qw9U)

From Nvidia (here's the full paper):

Most drones would be lost without GPS. Not this one. A drone developed by NVIDIA researchers navigates even the most far-flung, unmapped places using only deep learning and computer vision powered by NVIDIA Jetson TX1 embedded AI supercomputers. Although initially designed to follow forest trails to rescue lost hikers or spot fallen trees, the low-flying autonomous drone could work far beyond the forest — in canyons between skyscrapers or inside buildings, for example — where GPS is inaccurate or unavailable.

"This works when GPS doesn't," said Nikolai Smolyanskiy, the NVIDIA team's technical lead. "All you need is a path the drone can recognize visually." Researchers built their drone with off-the-shelf components to reduce costs.

No GPS? No Problem

Although the technology is still experimental, it could eventually search for survivors in damaged buildings, inspect railroad tracks in tunnels, check stock on store shelves, or be adapted to examine communications cables underwater, Smolyanskiy said. The team has already trained it to follow train tracks and ported the system to a robot-on-wheels to traverse hallways. The drone also avoids obstacles like people, pets or poles.

"We chose forests as a proving ground because they're possibly the most difficult places to navigate," he said. "We figured if we could use deep learning to navigate in that environment, we could navigate anywhere." Unlike a more urban environment, where there's generally uniformity to, for example, the height of curbs, the shape of mailboxes and the width of sidewalks, the forest is relatively chaotic. Trails in the woods often contain no markings. Light can be filtered through leaves; it also varies from bright sunlight to dark shadows. And trees vary in height, width, angle and branches.

Flight Record

To keep costs low, the researchers built their device using an off-the-shelf drone equipped with the NVIDIA Jetson TX1 and two cameras. "Our whole idea is to use cameras to understand and navigate the environment," Smolyanskiy said. "Jetson gives us the computing power to do advanced AI onboard the drone, which is a requirement for operating in remote environments."

The NVIDIA team isn't the first to pursue a drone that navigates without GPS, but the researchers achieved what they believe is the longest and most stable flight of its kind. Their fully autonomous drone flies along the trail for a kilometer (about six-tenths of a mile), avoiding obstacles and maintaining a steady position in the center of the trail. Team member Alexey Kamenev played a big role in making this happen. He developed deep learning techniques that allowed the drone to fly smoothly along trails without sudden movements that would make it wobble. He also reduced the need for the massive amounts of data typically required to train a deep learning system.

In the video above, the drone follows a trail in the forest near the researchers' Redmond, Wash., office. The areas in green are where the robot decided to fly, and the red areas are those it rejected.

No Breadcrumbs Needed

The drone learned to find its way by watching video that[...]
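The steering idea behind trail following can be illustrated in a few lines. This is my own minimal sketch, not NVIDIA's code: it assumes a classifier that outputs probabilities that the trail lies to the drone's left, center, or right, and turns those into a yaw-rate command (the function name and gain are invented for the example).

```python
# Hypothetical sketch: map a 3-class trail-orientation estimate to a yaw command.
# Assumes the network outputs P(trail left), P(trail center), P(trail right).

def yaw_rate_from_trail_probs(p_left, p_center, p_right, gain=1.0):
    """Map class probabilities to a yaw rate (positive = turn right).

    If the trail appears to the right, turn right toward it;
    if it appears to the left, turn left.
    """
    total = p_left + p_center + p_right
    if total == 0:
        return 0.0  # no information; hold heading
    p_left, p_right = p_left / total, p_right / total
    return gain * (p_right - p_left)

# Trail seen mostly to the right -> positive yaw rate (turn right toward it)
cmd = yaw_rate_from_trail_probs(0.1, 0.2, 0.7)
```

A center-heavy estimate yields a near-zero command, so the drone flies straight while the trail stays centered in the image.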



New two-motor VTOL from Horizon

2017-06-11T04:24:33.000Z

(YouTube video: https://www.youtube.com/embed/wFXibbgNix8)

This kind of 2-motor vertical take-off plane was a PhD thesis 2 years ago, a TED talk 1 year ago & now it's a $150 toy. From Horizon Hobby:

Key Features

- Multirotor versatility and sport plane agility
- Takes off and lands vertically in small areas
- Fly slow or fast and perform aerobatics in airplane mode
- Can be hand launched and belly-landed like a conventional wing
- Simple tail-sitter design and SAFE technology make VTOL flying easy
- Stability and Acro modes that provide a wide range of flight performance
- Optional and patent-pending FPV camera and servo-driven mechanism (sold separately)
- 280-size brushless motors compatible with 2S 450-800mAh LiPo batteries
- Outstanding speed and climb performance
- Lightweight and extremely durable EPO airframe
- Colorful decal sheet with multiple trim scheme options
- Ready to fly within minutes of opening the box
- Propeller guards and vertical fins that are easy to install or remove

Needed to Complete

- Full-range, 6+ channel DSMX®/DSM2® transmitter
- 450-800mAh 2S LiPo flight battery
- 2S compatible LiPo charger

What's in the box?

- (1) X-VERT VTOL Airplane
- (1) 3-in-1 Receiver/ESC/Flight Controller Unit
- (2) BL280 2600Kv Brushless Outrunner Motors
- (4) Decal Sheets
- (1) User Manual

Overview

The X-VERT™ VTOL gives you all the fun and versatility of a Vertical Take Off and Landing aircraft without the need for complex mechanics or fancy programming. It also makes the transition between multirotor and airplane flight as easy as flipping a switch. You can also take your flight experience to a whole different level using the optional and patent-pending FPV camera and servo-driven mechanism that transitions automatically when the X-VERT does (FPV gear sold separately).

Sleek and Simple Design

A lot of VTOL aircraft require complex mechanisms like tilting wings and motors to achieve vertical and forward flight. The X-VERT park flyer's simple tail-sitter design and SAFE® stabilization technology allow it to fly like an airplane or a multirotor using nothing more than differential thrust and its elevons. The simplicity of this design also makes the lightweight EPO airframe remarkably durable.

Wide, Pilot-Friendly Flight Envelope

The light wing loading and efficient aerodynamics inherent in the aircraft's design play a big role in making it easy to fly, especially in airplane mode. Fast or slow, pilots will enjoy smooth, predictable response at any speed.

SAFE® Flight Control Software Makes It Easy

At the heart of it all is exclusive SAFE (Sensor Assisted Flight Envelope) flight control software that has been expertly tuned so almost any RC pilot can experience the fun of VTOL flight.

Automated Transition

Making the transition between multirotor and airplane flight is as simple as flipping a switch. The flight controller will automatically transition the aircraft from one to the other, using SAFE technology to stabilize everything so you can relax and have fun.

3 Flight Modes

The advanced flight control software features three flight modes that, along with the model's light wing loading and efficient aerodynamics, give you a wide range of performance.

- Multirotor Stability Mode: This mode allows you to take off and land v[...]



Dronecode/PX4 1.6 code released!

2017-06-07T19:52:19.000Z

From the Dronecode release post:

We're very excited to announce the release of PX4 v1.6, the latest version of the Dronecode Flight Stack (PX4 Pro). This firmware represents a huge increase in usability, functionality, and stability/robustness since our last significant delivery back in August 2016 (PX4 v1.5.0). Just a few of the new features and enhancements in this release are:

- New flight modes for fixed wing: Acro and Rattitude
- New uLog logging format that directly logs uORB topics for more accurate time stamping. This is already supported for review and analysis here: http://review.px4.io
- Improvements to camera triggering to make it easier to use and provide better real-time feedback
- Support for survey flights in multicopter and fixed wing with an intuitive UI
- Temperature calibration and compensation
- Support for MAVLink and PWM controlled gimbals
- Support for generic helicopters and a Blade 130 mixer
- Improved robustness in EKF2 and hardening against marginal GPS reception
- Significant improvements to user experience for both the Qualcomm Snapdragon Flight and the Intel® Aero Ready to Fly Drone
- Support for STM32F7, and an update of NuttX to a recent release
- New hardware support including the Crazyflie v2, FMUv4 PRO and FMUv5 (special thanks to Drotek and Team Blacksheep for donating the FMUv4 and FMUv5 hardware!)

This is also the most tested and hardened PX4 release to date. The dedicated test team has done hundreds of hours of testing, on all the major vehicle platforms and using all the main reference flight controller hardware. A breakdown of the testing since the last stable release (1.5.5) is listed below:

2,257 commits tested.

847 total flights on 12 different vehicles and 6 different flight controllers:

- Pixhawk mini (DJI F450): 554
- Pixhawk mini (Generic Quad): 11
- Pixhawk 1 (DJI F450): 15
- Pixhawk mini (Hexa): 11
- Pixhawk mini (Phantom FW): 17
- Pixhawk mini (QAV 250): 28
- Pixracer (DJI F450): 34
- Pixracer (Flipsport): 140
- Pixhawk 3 Pro (DJI F450): 27
- Dropix (VTOL): 1
- Intel® Aero Ready to Fly Drone: 6
- Snapdragon (200qx): 3

6 releases tested: 1.6.0-rc1, 1.6.0-rc2, 1.6.0-rc3, 1.6.0-rc4, 1.6.0, 1.6.1

22 PRs tested: 6362, 6438, 6440, 6505, 6633, 6756, 6777, 6862, 6863, 6920, 7003, 7009, 7017, 7036, 7095, 7260, 7265, 7268, 7274, 7281, 7287, 7346

The firmware is already available in QGroundControl (for access to the best UI you may choose to use the "daily build" here). We owe a huge debt of gratitude to the whole PX4 Development Team for this outstanding work. Check out the release notes for more information [...]



Updating the "red balloon finder" to use OpenMV cam

2017-06-05T20:00:00.000Z

Awesome post from Patrick Poirier in the OpenMV forums (OpenMV is my favorite computer vision board, and what I use on our DIY Robocars autonomous racers).

--------

This project is a variation of Randy's original Red Balloon Finder implementation. Based on this blog post: http://diydrones.com/profiles/blogs/red-balloon-finder , I modified the Python scripts, making it possible to test on any ArduPilot-based quadcopter with a low-budget, relatively easy-to-implement controller.

(YouTube video: https://www.youtube.com/embed/a-rwGPdBSp8)

Code:

    # This is the configuration script, allowing usage of the configuration file
    import balloon_config
    # We need these modules (installed with dronekit) to control the vehicle
    from pymavlink import mavutil
    from dronekit import connect, VehicleMode, LocationGlobal
    # connect to vehicle with dronekit

    # MAIN
    # only process images once home has been initialised
    if self.check_home():
        # check if we are controlling the vehicle
        self.check_status()
        # look for balloon in image
        self.analyze_image()
        # search or move towards balloon
        if self.search_state > 0:
            # search for balloon
            self.search_for_balloon()
        else:
            # move towards balloon
            self.move_to_balloon()

    # move_to_balloon - velocity controller to drive vehicle to balloon
    #   calculate change in yaw since we began the search
    #   get speed towards balloon based on balloon distance
    #   apply min and max speed limit
    #   apply acceleration limit
    #   calculate yaw correction and final yaw movement
    #   calculate pitch correction and final pitch movement
    #   calculate velocity vector we wish to move in
    #   send velocity vector to flight controller
    send_nav_velocity(pitch_final, yaw_final, speed)

    # complete - balloon strategy has somehow completed so return control to the autopilot
    #   stop the vehicle and give up control
    #   if in GUIDED mode switch back to LOITER

OpenMV Script

The Red Balloon Finder is a typical colored blob detector/tracker. We are adding a serial output on UART 3 so the x-y location and the blob width and height can be transmitted to the RPi Zero:

    uart_baudrate = 9600
    uart = pyb.UART(3, uart_baudrate, timeout_char=1000)
    uart.write("%d ; %d ; %d ; %d \r\n" % (blob.cx(), blob.cy(), blob.w(), blob.h()))

Some theory

In vision-based systems, there are many types of hardware/software configurations tailored for specific applications: Visual Servoing, Visual Odometry, and Visual Simultaneous Localization And Mapping (SLAM). In this project we are using the first type, Visual Servoing, which is designed for:

- Take-off and landing
- Obstacle avoidance/tracking
- Position and attitude control
- Stabilization over a target

The main idea of Visual Servoing is to regulate t[...]
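On the companion-computer side of that serial link, the "cx ; cy ; w ; h" line the OpenMV script writes has to be parsed back into numbers. The post doesn't show that receiving code, so here is a small hedged sketch of what it might look like (the function name is mine; the field order mirrors the OpenMV `blob.cx()`, `blob.cy()`, `blob.w()`, `blob.h()` calls above):

```python
# Hypothetical receiver-side parser for the OpenMV UART line "cx ; cy ; w ; h".

def parse_blob_line(line):
    """Return (cx, cy, w, h) as ints, or None if the line is malformed."""
    try:
        fields = [int(p.strip()) for p in line.split(";")]
    except ValueError:
        return None  # noise or a partial line on the serial port
    return tuple(fields) if len(fields) == 4 else None
```

In practice each line read from the RPi Zero's serial port would be run through this, and a `None` result (a corrupted or partial line) simply skipped.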



Test stats for next Dronecode/PX4 code release

2017-05-29T19:09:07.000Z

(image)

This week the Dronecode/PX4 team will release its biggest and best update, version 1.6. You can see a glimpse of the testing that went into it with our public test log server here

Along with all the automated code and flight-simulator testing, there are more than 100 hours of real-world flight testing (above and beyond the many times that figure logged by beta testers) by the full-time Dronecode Test Team, as shown here.

(image)  




Comparing two low-cost scanning Lidars

2017-05-29T00:00:00.000Z

Excerpted from a new post at our DIY Robocars sister site:

It's now possible to buy small scanning 2D lidars for less than $400, which is pretty amazing, since they cost as much as $30,000 a few years ago. But how good are they for small autonomous vehicles? I put two to the test: the RP Lidar A2 (left above) and the Scanse Sweep (right).

The RP Lidar A2 is the second lidar from Slamtec, a Chinese company with a good track record. Sweep is the first lidar from Scanse, a US company, and was a Kickstarter project based on the Lidar-Lite 3 1D laser range finder unit that was also a Kickstarter project a few years ago (I was an adviser for that) and is now part of Garmin.

The good news is that both work. But in practice, the difference between them becomes very stark, with the biggest being the four times higher resolution of the RP Lidar A2 (4,000 points per second, versus Sweep's 1,000), which makes it actually useful outdoors in a way that Sweep is not. Read on for the details.

First, here are the basic spec comparisons. Bottom line: the RP Lidar A2 is smaller, much higher resolution, and has better range indoors (it's notable that the real-world RP Lidar performance was above its stated specs, while the Scanse performance was below its stated specs). The Scanse desktop visualization software is better, with lots of cool options such as line detection and point grouping, but in practice you won't use it, since you'll just be reading the data via Python in your own code. Sadly, the Scanse code that does those cool things does not appear to be exposed as libraries or APIs that you can use yourself. In short, I recommend the RP Lidar A2.

I tested them both in small autonomous cars, as shown below (RP Lidar at left). Both have desktop apps that allow you to visualize the data. Here's a video of the two head-to-head scanning the same room (RP Lidar is the window on the right):

(YouTube video: https://www.youtube.com/embed/0FsXCD9jQHw)

You can see the difference in resolution pretty clearly in that video: the RP Lidar just has four times as many points, and thus four times higher angular resolution. That means it can not only see smaller objects at a distance, but the objects it does see have four times as many data points, making it much easier to differentiate them from background noise.

As far as using them with our RaspberryPi autonomous car software goes, it's a pretty straightforward process of plugging them into the RaspberryPi via the USB port (the RP Lidar should be powered separately; see the notes below) and reading the data with Python. My code for doing this is in my Github repository here. We haven't decided how best to integrate this data with our computer vision and neural network code, but we're working on that now — watch this space. Read the rest here [...]



Using voice to command a drone by translating any GPS location to three unique words

2017-05-25T14:27:29.000Z

(image) An Australian IT services company combined an Amazon Alexa and a service called "what3words" that translates GPS addresses into three unique words to create a MAVLink-compatible drone that can be commanded entirely by voice. The video can't be embedded, so see it here

DXC Labs has created an experimental voice-activated (Amazon Alexa) and cloud-controlled (AWS IoT) drone that uses three-word identifiers from what3words to provide precise location directions to within three square meters, anywhere in the world. This means the drone operator (such as a first responder or maintenance worker) can easily give a voice command such as “go to location: public warns artist” that will send the drone to a specific place on earth — in this case the historic shipwreck of the HMVS Cerberus, south of Melbourne, Australia. 

(image)

Getting to Hard-to-Reach Places

 
The what3words service allows user-friendly routing of the computer-controlled drone to locations that may not have a conventional street address, such as plant and equipment locations, a missing person in a national park, a fire in a large campus, or any location whose street address is inaccurate or ambiguous. 

This greatly improves activities such as identifying disaster zone locations for first responders, inspecting power lines and oil rigs, making deliveries to hard-to-reach places, and traveling to any point in the world.
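The command flow described above can be sketched in a few lines. To be clear, this is only an illustration of the idea: the real system resolves the three words through the what3words service, whereas here a stub lookup table stands in, and the coordinates in it are invented placeholders, not the real location of the wreck.

```python
# Illustrative sketch of the voice-command flow. STUB_W3W stands in for the
# what3words lookup; its coordinates are placeholders invented for the example.

STUB_W3W = {
    "public.warns.artist": (-37.999, 145.010),  # placeholder, not real data
}

def parse_goto(command):
    """Extract a three-word key from a command like 'go to location: w1 w2 w3'."""
    prefix = "go to location:"
    if not command.lower().startswith(prefix):
        return None
    words = command[len(prefix):].split()
    return ".".join(words) if len(words) == 3 else None

def resolve(command, table=STUB_W3W):
    """Turn a voice command into (lat, lon), or None if it isn't a goto."""
    key = parse_goto(command)
    return table.get(key) if key else None
```

Once resolved, the (lat, lon) pair would be handed to the MAVLink side as an ordinary guided-mode target.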




Lovely story about a poor kid in rural China who sells exquisite models of Chinese planes for almost no profit

2017-05-24T14:22:47.000Z

These aren't drones per se, but I can't help but post this story by Owen Churchill in Sixth Tone, which exemplifies the DIY spirit.

CHONGQING, Southwest China — Dismembered remote-controlled airplanes lie strewn across an unmade mattress: a motor glued crudely into an improvised wooden housing, landing gear fashioned out of a metal ruler, and sheets of foam advertising board that will become wings, fuselages, and flaps.

This is the spare room of Hu Bo, an 18-year-old from a village two hours from Chongqing who has turned his hand to making flyable models of China's home-grown aircraft. He started with decades-old military planes and has recently worked on the new flagbearer of the country's civil aviation industry, the C919 passenger jet. Using self-taught techniques, open-source plans downloaded from the internet, and cheaply acquired or improvised parts, Hu sells his finished models online — unless he crashes them during the test flight. "My technique isn't so good," he tells Sixth Tone with a smile as he tinkers with his latest build, a 1.4-meter-wide, twin-propeller plane modeled on China's Y-12 utility aircraft.

Like millions of other Chinese children, Hu grew up under the guardianship of his grandparents while his parents traveled in search of work, dividing his time between doing homework, playing with the family's dog, and making intricate paper airplanes at the family home in Yangliu Village, a tiny hamlet around 100 kilometers west of Chongqing. But now, having scraped together money to buy some basic tools, Hu has joined the ranks of China's rising number of amateur aviation enthusiasts, spurred on by a huge yet inconsistently regulated drone industry and inspired by the increasing prowess of the country's home-grown fleet of both military and civilian aircraft. A number of fifth-generation fighter jets are slated to enter service in the next few years; the maiden flight of the world's widest seaplane — the AG600 — is scheduled for this year; and the country's first large passenger jet in decades, the C919, took to the skies for the first time on May 5, catalyzing the emergence of a new generation of patriotic plane spotters despite its plethora of foreign parts.

"For poor people like us, we have time but no money, so we have to make it ourselves." — Hu Bo, model plane enthusiast

Hu has never been on a plane, nor has he ever purchased a complete remote-controlled plane. While he was inspired to start building planes after seeing local friends discussing the latest and best modeling equipment on social media, he has little respect for people who throw money at the hobby. "They are all renminbi fliers," he says, referring to China's currency. "For poor people like us, we have time but no money, so we have to make it ourselves. For them, they have money but no time, so they just buy everything outright."

Buyers on Xianyu — the secondhand version of China's premier online marketplace, Taobao — have already praised the caliber of Hu's models, especially the C919. But at least for now, Hu has little interest in making a profit from his planes — if he believes the buyer is a genuine plane enthusiast who will cherish his mo[...]



Follow-me with dynamic obstacle avoidance

2017-05-18T20:16:01.000Z

(YouTube video: https://www.youtube.com/embed/dh7yOEHBc_w)

From New Atlas, a writeup on new research from ETH and MIT:

Thanks to new research from MIT and ETH Zurich, however, it may soon be possible for drones to autonomously follow along with an actor, keeping their face framed "just right" the whole time – while also avoiding hitting any obstacles.

To utilize the experimental new system, operators start by using a computer interface to indicate who the drone should be tracking, how much of the screen their face or body should occupy, where in the screen they should be, and how they should be oriented toward the camera (choices include straight on, profile, three-quarter view from either side, or over the shoulder).

Once the drone is set in action, the computer wirelessly sends it control signals that allow it to fly along with the actor as they walk, adjusting its flight in order to maintain the shot parameters. This means that if the actor were to start turning their back on the drone, for instance, it would automatically fly around in front of them, to keep their face in the shot. Likewise, if they started walking faster, the drone would also speed up in order to keep them the same distance from the camera.
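The "keep the face in the shot" behavior boils down to a bit of geometry: given the actor's position and heading plus the desired shot, there is a target point where the camera should be. This is an assumed illustration of that relationship, not the researchers' actual controller; the function, the angle table, and all names are mine.

```python
import math

# Hypothetical framing geometry: place the drone at a chosen viewing angle
# and distance relative to the actor's facing direction.

VIEW_ANGLES = {            # offset from the actor's heading, in radians
    "straight_on": 0.0,
    "three_quarter": math.pi / 4,
    "profile": math.pi / 2,
    "over_shoulder": math.pi,
}

def camera_position(actor_xy, actor_heading, shot, distance):
    """Return the (x, y) point where the drone should hover for the shot."""
    bearing = actor_heading + VIEW_ANGLES[shot]
    x = actor_xy[0] + distance * math.cos(bearing)
    y = actor_xy[1] + distance * math.sin(bearing)
    return x, y
```

Re-evaluating this target as the actor turns is exactly what produces the described behavior: when the actor turns their back, the "straight on" target sweeps around to their new front, and the drone follows it.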

It's additionally possible for the aircraft to follow small groups of actors, working to keep that group framed a certain way within the shot. The user can stipulate one of those actors as the main subject, ensuring that the drone moves in order to keep other actors from blocking the camera's view of them.

The system utilizes algorithms that predict the actor's trajectory about 50 times a second, allowing the aircraft to effectively stay one step ahead of the action. This also allows it to correct its own flight path if its onboard sensors detect that it's heading toward a stationary obstacle, or if a moving obstacle (such as an actor) is on a collision course with it.

A team led by MIT's Prof. Daniela Rus will be presenting a paper on the research later this month at the International Conference on Robotics and Automation. The system is demonstrated in the video above.

Source: MIT




Pixhawk-powered rover wins 2017 RoboMagellan competition

2017-05-11T23:26:33.000Z

(YouTube video: https://www.youtube.com/embed/j0d6qT2OeDk)

Congrats to Marco Walter, who won the 2017 RoboMagellan competition at RoboGames with a rover that used Pixhawk + Ardurover + Odroid U3.





Drone learns to fly by crashing (a lot)

2017-05-10T20:39:42.000Z

(YouTube video: https://www.youtube.com/embed/HbHqC8HimoI)

Excerpt from the IEEE Spectrum article:

"Learning to Fly by Crashing," a paper from CMU roboticists Dhiraj Gandhi, Lerrel Pinto, and Abhinav Gupta, has such a nice abstract that I'll just let them explain what this research is all about:

[T]he gap between simulation and real world remains large especially for perception problems. The reason most research avoids using large-scale real data is the fear of crashes! In this paper, we propose to bite the bullet and collect a dataset of crashes itself! We build a drone whose sole purpose is to crash into objects [. . .] We use all this negative flying data in conjunction with positive data sampled from the same trajectories to learn a simple yet powerful policy for UAV navigation.

Cool, let's get crashing!

One way to think of flying (or driving or walking or any other form of motion) is that success is simply a continual failure to crash. From this perspective, the most effective way of learning how to fly is by getting a lot of experience crashing, so that you know exactly what to avoid; once you can reliably avoid crashing, you by definition know how to fly. Simple, right? We tend not to learn this way, however, because crashing has consequences that are usually quite bad for both robots and people.

The CMU roboticists wanted to see if there are any benefits to using the crash approach instead of the not-crash approach, so they sucked it up and let an AR Drone 2.0 loose in 20 different indoor environments, racking up 11,500 collisions over the course of 40 hours of flying time. As the researchers point out, "since the hulls of the drone are cheap and easy to replace, the cost of catastrophic failure is negligible." Each collision is random, with the drone starting at a random location in the space and then flying slowly forward until it runs into something. After it does, it goes back to its starting point and chooses a new direction. Assuming it survives, of course.

During this process, the drone's forward-facing camera is recording images at 30 Hz. Once a collision happens, the images from the trajectory are split into two parts: the part where the drone was doing fine, and the part just before it crashes. These two sets of images are fed into a deep convolutional neural network (with ImageNet-pretrained weights as initialization), which uses them to learn whether a given camera image means that going straight is a good idea or not. After 11,500 collisions, the resulting algorithm is able to fly the drone autonomously, even in narrow, cluttered environments.
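The labeling scheme described above (frames far from the impact are "safe to go straight" positives, frames just before it are negatives) is simple enough to sketch. This is my own illustration of the idea; the paper does not publish this helper, and the 10-frame margin is an assumed parameter, not the authors' value.

```python
# Hypothetical labeling of one crash trajectory's frames into
# positive ("kept flying fine") and negative ("about to crash") sets.

def label_trajectory(num_frames, crash_margin=10):
    """Return (positive_idx, negative_idx) for a trajectory ending in a crash.

    The final `crash_margin` frames before impact become negative examples;
    everything earlier becomes positive.
    """
    split = max(0, num_frames - crash_margin)
    return list(range(split)), list(range(split, num_frames))
```

Run over all 11,500 trajectories, this produces the two image sets that are fed to the convolutional network as its binary training data.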



Navy wins DARPA swarm challenge

2017-05-09T06:08:24.000Z

(YouTube video: https://www.youtube.com/embed/igz2dmDLOZY)

The Air Force came in second. Most teams seem to be using 3DR Pixhawks, by the looks of the teaser video. Final video coming soon.

http://www.uasvision.com/2017/05/09/usaf-academy-cadets-second-in-uas-competition/





Review of the $50 Jevois computer vision camera/computer

2017-05-07T22:30:00.000Z

(image)

Full review at our sister site, DIY Robocars, here, but this is how it starts:

I'm a huge fan of the OpenMV cam, which is a very neat $65 integrated camera and processor with sophisticated built-in computer vision libraries, a MicroPython interpreter, and a very slick IDE (a car powered entirely by it came in 2nd in the Thunderhill DIY Robocars race). Now there is a competitor on the block, Jevois, which offers even more power at a lower cost. I've now spent a week with it and can report back on how it compares.

(image)

I tested it on an autonomous rover, to see how it performed in a stand-alone embedded environment (as opposed to being connected via USB to a PC):

(image)

Read the rest here...




Two-motor VTOL with Pixracer

2017-05-03T04:54:57.000Z

(YouTube video: https://www.youtube.com/embed/JlALdKd-ZzE)

Impressive design! APM on a Pixracer controller. More here.
