My Version of a Robot Tutorial

After working with ROS for several years, I have seen how difficult it is to get anything working that is not a canned tutorial: sure, you can publish and subscribe, but the important question is how do I get my robot idea to be functional.
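For context, the publish/subscribe pattern those canned tutorials cover is simple at its core. Here is a minimal sketch of the idea in plain Python, without ROS; the `Bus` class and the topic name are illustrative only, not a real ROS API:

```python
# Minimal publish/subscribe sketch: the pattern ROS implements
# across processes, shown here in a single process for clarity.

from collections import defaultdict

class Bus:
    def __init__(self):
        self._subs = defaultdict(list)  # topic name -> list of callbacks

    def subscribe(self, topic, callback):
        # Register a callback to fire on every message for this topic
        self._subs[topic].append(callback)

    def publish(self, topic, msg):
        # Deliver the message to every subscriber of this topic
        for cb in self._subs[topic]:
            cb(msg)

bus = Bus()
received = []
bus.subscribe("/scan", received.append)
bus.publish("/scan", {"ranges": [1.2, 1.3]})
print(received)
```

ROS adds the hard parts on top of this pattern: message serialization, network transport, and node discovery. Getting past the pattern itself is where the real work starts.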

Building a robot is hard. Building a robot in simulation is just as hard. Getting your own idea working on a robot is almost impossible. I am going to document my process of creating a perception module that consists of a 2D lidar and a dual-fisheye 360° camera. This system will be used as a plug-in for 3D modeling.
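The geometry behind the 3D-mapping side is worth sketching: each 2D scan point, combined with the servo tilt angle, yields a 3D point. A minimal sketch in Python, assuming the lidar scans in its local x-y plane and the servo rotates that plane about the x-axis (the axis conventions and function name are my own, not from any ROS package):

```python
# Sketch: turning a 2D lidar scan into 3D points by sweeping the
# scanner with a tilt servo. Axis conventions are assumptions.

import math

def scan_point_to_3d(r, bearing, tilt):
    """r: range (m); bearing: angle within the scan plane (rad);
    tilt: servo angle (rad). Returns (x, y, z) in the servo frame."""
    # Point in the lidar's 2D scan plane
    x = r * math.cos(bearing)
    y = r * math.sin(bearing)
    # Rotate the scan plane about the x-axis by the servo tilt
    return (x, y * math.cos(tilt), y * math.sin(tilt))

# A point 2 m away, 90 degrees left in the scan, servo tilted 30 degrees
x, y, z = scan_point_to_3d(2.0, math.pi / 2, math.pi / 6)
print(round(x, 3), round(y, 3), round(z, 3))
```

Stepping the servo through its range and applying this to every scan point builds up a 3D point cloud one slice at a time.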

Parts list: 3D mapper

  • SICK TiM 561
  • Razor IMU
  • Ricoh Theta V 360° camera
  • Robotis MX-64 servo
  • Servo driver

Parts list: point of interest

  • Cameras
  • Point lidar
  • Razor IMU
  • Lighting
  • Laser pointer

So how do I get started?

I always start with the very basics for any component:

  • Manuals
  • Drawings (SDF/URDF)
  • Power
  • Interface cabling
  • Mounting
  • Setup and configuration
  • ROS package

These items should be added to your code repo and package documentation.
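For the drawings item, a minimal URDF sketch of the 3D mapper's tilt joint might look like the following. The link names, offsets, and joint limits are placeholders, not measured values:

```xml
<!-- Minimal URDF sketch: lidar mounted on the MX-64 tilt joint.
     Names, offsets, and limits are placeholders. -->
<robot name="mapper_3d">
  <link name="base_link"/>
  <link name="laser_link"/>
  <joint name="tilt_servo" type="revolute">
    <parent link="base_link"/>
    <child link="laser_link"/>
    <origin xyz="0 0 0.05" rpy="0 0 0"/>
    <axis xyz="1 0 0"/>
    <limit lower="-1.57" upper="1.57" effort="6.0" velocity="6.0"/>
  </joint>
</robot>
```

Even a stub like this is worth committing early; it gives simulation and TF something to work with while the real measurements get filled in.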

DARPA Subterranean “SubT” Challenge

Maxed-Out is now registered and trying to field an autonomous robot to compete in the physical part of the DARPA Subterranean Challenge. Details are on their website: http://subtchallenge.com

This is a scavenger hunt in the dark. The difference is that we only have to accurately map the objects; we do not need to pick them up and bring them back as in the NASA Sample Return Challenge. Below is a video outlining the tunnel portion of the challenge.

The real challenge runs August 19-24, 2019, so I look forward to seeing how the other teams worked through the problems and performed in a real run. There are only 9 teams, and I believe all of them are funded by DARPA. One team will be bringing home some winnings. I am mostly interested in seeing how many artifacts the winning team accurately maps.

Ubuntu and Melodic

The challenge is using ROS Melodic, which runs on Ubuntu 18.04, on Raspberry Pis and Jetsons. There have been lots of issues, since neither the Raspberry Pi nor the Jetson TX2 officially supports 18.04 yet.

So far I have updated the following to Ubuntu 18.04 and ROS Melodic:

  • NUC: will run the base station; AR tracking will run from it.
  • Raspberry Pi: OpenCV image is up, and I will start bringing up cameras with raspi_camera nodes running.
  • Fang laptop: can run the OSRF simulation they created for the SubT challenge.

I have not had luck with the Jetson TX2s, which will be the brain-stem system for Max.

New year, New location, New start

Hello everyone. I am bringing Maxed-Out back to life here in Gig Harbor.
We moved up to the Pacific Northwest in June 2017 and purchased a house in November 2017. After getting settled and acclimated, it is time to forget the past and focus on the future of Max.
I am fortunate to be working with a large group of young college grads who are showing interest in helping get Max doing something useful.
So time will tell, but the first step is to get the site back up and running, dust off the Slack channel and the code, and get Max doing autonomous tasks.