Products & Services
What is the RoboCar 1/10 series?
"Since its launch in 2009, RoboCar 1/10 has been used in various applications such as research and development of Autonomous Driving and AI development at research and educational institutions such as automobile / parts manufacturers and universities. The estimated sales volume till date is more than 600 units. A monocular camera and a laser range sensor are mounted on a 1/10 scale vehicle of an automobile, and the behavior and mileage of the vehicle can be grasped by the acceleration / gyro sensor and encoder. In addition, we have prepared libraries for acquisition of various sensor information, speed / steering angle control, communication, etc., and customers can freely develop applications using these libraries. "

RoboCar 1/10X: a robot car for autonomous driving / AI technology development

Autonomous Driving development platform RoboCar 1/10X

A development tool for autonomous driving / ADAS!
Equipped with the NVIDIA Jetson AGX Xavier and supporting ROS

Overview

The latest model, the RoboCar 1/10X, uses the NVIDIA Jetson AGX Xavier developer kit, whose GPU enables the implementation of advanced AI algorithms. In addition, various OSs and libraries for autonomous driving and AI applications, such as ROS (Robot Operating System), come pre-installed, so development can start on the day of unpacking.

Its features are:

・ Experiments can be run indoors with a 1/10-scale model of an actual vehicle
・ The NVIDIA processor facilitates autonomous driving development using image recognition and AI learning
・ Custom sensors can be added, and the robot can be operated over a wireless connection

It is a development platform on which you can start developing and experimenting with autonomous driving from the day it is introduced.

It also supports ROS (Robot Operating System), an open-source software platform for robots that is widely used in autonomous driving R&D, so ROS-based software developed around the world can be used with it.
RoboCar 1/10X image video

System configuration

On-board sensor

Equipped with a front camera and front and rear LiDAR; the vehicle's behavior and travel distance can be tracked with the acceleration/gyro sensor and encoders.
Libraries are also provided for acquiring sensor data, controlling speed and steering angle, and communication, and customers can freely develop applications using them.
In addition, CPU performance has been greatly improved, enabling control development that uses image processing and real-time recognition with deep learning.
About on-board sensor

ROS(Robot Operating System)

What is ROS

ROS is an abbreviation for Robot Operating System. Despite the name, it is not an operating system like Windows, Linux, or macOS; it is open-source software that includes development tools and libraries.
It was developed by Willow Garage, a US company that develops robot software, and is currently managed by a non-profit organization, the Open Source Robotics Foundation (OSRF).

Today it is open to the world, anyone can build on the technology and know-how developed so far, and it is actively used at universities and research institutes. ROS is a suitable platform for autonomous driving development, but it was said to lack many functions important for safety-oriented environments such as vehicle control systems.

In response, ROS 2 has been released. In addition to ROS's hardware abstraction, device drivers, libraries, visualization tools, message communication, and package management, it adds functions such as security, real-time control, network quality control, simultaneous use of multiple robots, and commercial support, and its use in autonomous driving research and development is being actively considered.
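As a minimal illustration of ROS message communication (a sketch only, not part of the RoboCar SDK), the following ROS 1 (rospy) node publishes velocity commands as standard geometry_msgs/Twist messages; the topic name /cmd_vel is a common ROS convention and is used here purely as an example.

```python
#!/usr/bin/env python
# Minimal ROS 1 (rospy) publisher: sends velocity commands on /cmd_vel.
# Illustrative only; the actual RoboCar 1/10X topics are defined by its SDK.
import rospy
from geometry_msgs.msg import Twist

def main():
    rospy.init_node("cmd_vel_publisher")
    pub = rospy.Publisher("/cmd_vel", Twist, queue_size=10)
    rate = rospy.Rate(10)  # publish at 10 Hz
    while not rospy.is_shutdown():
        msg = Twist()
        msg.linear.x = 0.5   # forward speed [m/s]
        msg.angular.z = 0.1  # yaw rate [rad/s]
        pub.publish(msg)
        rate.sleep()

if __name__ == "__main__":
    main()
```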

Now available on GitHub

GitHub is a software development platform that hosts source code. Hosted code can be reviewed and developed collaboratively by multiple developers, and GitHub also offers social features such as feeds and followers.
Sample code is published free of charge through an open-source project account.

RoboCar 1/10X | A full lineup of sample applications

NEW !!

Simulator application (V1.5 or later)

A model of the RoboCar 1/10X has also been created for the ROS-based Gazebo simulator.
Using this sample app, you can check the vehicle's driving behavior and sensor information in a virtual space.
Gazebo is a virtual simulation tool for robots operated with ROS. It supports multiple high-performance physics engines such as ODE and Bullet, provides realistic rendering of environments (high-quality lighting, shadows, and textures), and can simulate sensors such as laser rangefinders, cameras, and Kinect-style depth sensors.

There are two main ways to use Gazebo:
I. Install an existing robot simulation model, change the simulation conditions, and run verification.
II. Define everything yourself, from building the robot model onward, and run the simulation.

As for I, a model of the RoboCar 1/10X has already been built, so you can use it immediately; a minimal sketch of reading the simulated vehicle's state follows.
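As a rough sketch (not the official sample application), the following rospy node reads the simulated vehicle's pose from Gazebo's standard /gazebo/model_states topic; the model name robocar_110x is a hypothetical placeholder.

```python
#!/usr/bin/env python
# Read the simulated vehicle pose from Gazebo via ROS.
# Sketch only: the model name used by the RoboCar 1/10X sample may differ.
import rospy
from gazebo_msgs.msg import ModelStates

MODEL_NAME = "robocar_110x"  # hypothetical placeholder name

def callback(msg):
    if MODEL_NAME in msg.name:
        i = msg.name.index(MODEL_NAME)
        p = msg.pose[i].position
        rospy.loginfo("pose: x=%.2f y=%.2f z=%.2f", p.x, p.y, p.z)

def main():
    rospy.init_node("gazebo_pose_monitor")
    rospy.Subscriber("/gazebo/model_states", ModelStates, callback)
    rospy.spin()

if __name__ == "__main__":
    main()
```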
RoboCar 1/10X simulator app video
Gazebo app video

Navigation application (V1.4 or later)

Using the map created by SLAM for navigation, a route can be generated and the RoboCar 1/10X driven along it automatically.
When you enter a goal point on the sample application screen, a travel route is generated automatically. A sampling-based motion planning method is used to generate this route.
In this family of methods, represented by RRT (Rapidly-exploring Random Tree) and PRM (Probabilistic Roadmap), the configuration space is sampled randomly to build a network of collision-free routes called a roadmap, and a graph search over this roadmap produces a route from the initial position to the goal.
These methods have evolved rapidly over the last decade, and the application includes an online replanning system that immediately regenerates a collision-free route whenever interference with an obstacle is predicted.
The generated route is then used to drive the RoboCar 1/10X autonomously; a minimal planning sketch follows.
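For illustration only, here is a minimal 2D RRT sketch under simplified assumptions (a point robot and axis-aligned rectangular obstacles); it is not the planner shipped with the sample application, and all parameters and the obstacle map are hypothetical.

```python
import math
import random

# Minimal 2D RRT sketch: point robot, axis-aligned rectangular obstacles.
# Illustrative only; not the planner shipped with the RoboCar 1/10X sample app.

STEP = 0.2                          # extension step size [m]
GOAL_TOL = 0.3                      # goal tolerance [m]
OBSTACLES = [(2.0, 2.0, 1.0, 1.0)]  # (x, y, width, height), hypothetical map

def collides(p):
    """True if point p lies inside any rectangular obstacle."""
    return any(ox <= p[0] <= ox + w and oy <= p[1] <= oy + h
               for ox, oy, w, h in OBSTACLES)

def rrt(start, goal, bounds=(0.0, 5.0), iters=5000):
    """Grow a tree from start by random sampling; return a path to goal or None."""
    nodes = [start]
    parent = {start: None}
    for _ in range(iters):
        sample = (random.uniform(*bounds), random.uniform(*bounds))
        near = min(nodes, key=lambda n: math.dist(n, sample))
        d = math.dist(near, sample)
        if d == 0.0:
            continue
        new = (near[0] + STEP * (sample[0] - near[0]) / d,
               near[1] + STEP * (sample[1] - near[1]) / d)
        if collides(new):
            continue
        nodes.append(new)
        parent[new] = near
        if math.dist(new, goal) < GOAL_TOL:
            path = [new]
            while parent[path[-1]] is not None:
                path.append(parent[path[-1]])
            return list(reversed(path))
    return None  # no path found within the iteration budget

if __name__ == "__main__":
    print(rrt((0.5, 0.5), (4.5, 4.5)))
```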
Navigation app introduction video

Object detection application (V1.3 or later)

[SSD (Single Shot MultiBox Detector)]
An algorithm for general object detection using machine learning. Using deep learning, it can detect many types of objects at high speed, and it can be trained to memorize and detect a specific object.
As the name "Single Shot" implies, SSD performs both region-candidate detection and classification of objects in a single CNN pass, which is what makes object detection fast.
The SSD network uses a model originally built for image classification as its base, with additional convolution layers on top. At prediction time, feature maps from the base network and the added layers are used together to extract class features and position features and detect objects.
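As a hedged sketch of running an SSD detector, the following uses torchvision's pretrained SSD300/VGG16 model rather than the RoboCar sample application itself; the image file name is a placeholder.

```python
import torch
import torchvision
from torchvision.transforms.functional import to_tensor
from PIL import Image

# Sketch: run a pretrained SSD300/VGG16 detector on a single image.
# Uses torchvision's model zoo (torchvision >= 0.13 assumed; older versions
# take pretrained=True instead of weights="DEFAULT").
model = torchvision.models.detection.ssd300_vgg16(weights="DEFAULT")
model.eval()

image = Image.open("front_camera.jpg").convert("RGB")  # placeholder file name
with torch.no_grad():
    detections = model([to_tensor(image)])[0]

# Keep detections with confidence above 0.5.
for box, label, score in zip(detections["boxes"],
                             detections["labels"],
                             detections["scores"]):
    if score > 0.5:
        print(int(label), float(score), [round(v, 1) for v in box.tolist()])
```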
Object detection app introduction video
Object detection result sample

SLAM sample application (V1.2 or later)

RoboCar 1/10X "SLAM Package" Introduction Video
[Cartographer]
Cartographer is an open-source algorithm that integrates multiple platforms and sensors in real time and provides simultaneous localization and mapping (SLAM) in 2D or 3D. It is used as a key component of autonomous robots such as robot vacuum cleaners, automated forklifts, and autonomous vehicles.
2D MAP sample
[Hector SLAM]
Open-source software that can be used in a ROS environment, developed by members of the "Hector" robotics research team at Technische Universität Darmstadt, Germany, for example for robots operating in Urban Search and Rescue (USAR). The algorithm was developed for situations where building a map of an unknown environment is critical, and it aims to estimate the 6-axis pose without relying on odometry.
2D MAP sample
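As a small illustrative sketch (not part of the SLAM package itself): SLAM nodes such as Hector SLAM and Cartographer typically publish the resulting occupancy grid on the standard /map topic, which can be read as follows.

```python
#!/usr/bin/env python
# Read the occupancy grid produced by a ROS SLAM node (e.g. Hector or Cartographer).
# Sketch only; topic remapping in the actual sample package may differ.
import rospy
from nav_msgs.msg import OccupancyGrid

def on_map(grid):
    info = grid.info
    rospy.loginfo("map: %dx%d cells, resolution %.3f m/cell",
                  info.width, info.height, info.resolution)

def main():
    rospy.init_node("map_listener")
    rospy.Subscriber("/map", OccupancyGrid, on_map)
    rospy.spin()

if __name__ == "__main__":
    main()
```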
[Other]
・ Steering & drive motor control
・ Acquisition of various sensor values & camera images
・ Obstacle avoidance using the laser range sensor (see the sketch below)
・ Data communication over Wi-Fi
・ Remote control, etc.
Sample application image
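A minimal sketch of the obstacle-avoidance idea, assuming the laser range sensor is published as a standard sensor_msgs/LaserScan and the vehicle accepts geometry_msgs/Twist commands; the actual RoboCar 1/10X topic names and control interface may differ.

```python
#!/usr/bin/env python
# Naive obstacle avoidance: slow down and steer away when the front scan is blocked.
# Sketch only; the RoboCar 1/10X SDK's actual topics and control API may differ.
import rospy
from sensor_msgs.msg import LaserScan
from geometry_msgs.msg import Twist

STOP_DIST = 0.5  # [m] distance at which to start avoiding

class Avoider(object):
    def __init__(self):
        self.pub = rospy.Publisher("/cmd_vel", Twist, queue_size=1)
        rospy.Subscriber("/scan", LaserScan, self.on_scan)

    def on_scan(self, scan):
        # Treat the central third of the scan as the "front" sector.
        n = len(scan.ranges)
        front = [r for r in scan.ranges[n // 3: 2 * n // 3] if r > scan.range_min]
        cmd = Twist()
        if front and min(front) < STOP_DIST:
            cmd.linear.x = 0.0   # stop
            cmd.angular.z = 0.5  # turn away from the obstacle
        else:
            cmd.linear.x = 0.3   # cruise forward
        self.pub.publish(cmd)

if __name__ == "__main__":
    rospy.init_node("lrf_obstacle_avoidance")
    Avoider()
    rospy.spin()
```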

Product Specifications

Body
  Length × Width × Height / Weight: 190 × 429 × 150 [mm] / 3.0 [kg]
  Maximum load weight: 1 [kg]
  Minimum turning radius: approx. 500 [mm]
  Maximum speed: approx. 10 [km/h]
  Chassis frame: Aluminum chassis, double-wishbone suspension, ZMP aluminum frame
  Motors: Drive: small DC motor / Steering: robot servo motor
  Battery: Control battery (optional): dedicated Li-ion battery pack (x1) / Drive battery: nickel-metal hydride battery pack (7.2 [V] x1)
  On-board sensors:
    Monocular USB camera x1 (front): 1920 x 1080 [RAW], 60 [fps], 139 [deg], CMOS image sensor
    Laser range sensor x2 (front and rear): detection distance 20-5,600 [mm], 240 [deg]
    Gyro (1 axis), acceleration (3 axes), rotary encoders (wheels x4, motor x1, steering x1)
  In-vehicle CPU: NVIDIA Jetson AGX Xavier (8-core ARM v8.2 64-bit), GPU: 512-core Volta GPU with Tensor Cores, RAM: 32 GB, SSD: 1 TB
  Wi-Fi: IEEE 802.11b/g/n/ac, WEP/WPA, 2.4 GHz / 5 GHz

Software (main unit)
  OS: Linux (Ubuntu 18.04)
  Supported libraries: ROS, CUDA, cuDNN, TensorFlow, PyTorch, OpenCV, PCL
  Sample programs: Vehicle control, sensor information acquisition, LAN communication, obstacle avoidance using LRF, remote control, SLAM (Hector, Cartographer)
  Accessories: Joystick controller, control/drive battery chargers
* Specifications of this product are subject to change without notice.

Extended functions & system integration examples

Using the RoboCar 1/10X as a hardware platform, control development with the standard sensors or with additional sensors can be carried out.
It can also be used for controlling multiple vehicles via its communication functions, for developing road-to-vehicle cooperation systems that link vehicles with their surroundings (traffic signals, external devices, etc.), and for developing remote monitoring systems.
RoboCar 1/10X system configuration and use cases

Comparison with the previous model

"What's new", explained in 3 minutes
Comparison between the RoboCar 1/10X and the previous model

RoboCar 1/10 series adoption examples

Since the launch of the RoboCar 1/10 series in 2009, more than 300 units have been adopted by educational institutions.
Please see the link below for details of these deployments.

Price

【Selling price】

Regular price: 1.8 million JPY (excluding tax) / Academic price: 1.44 million JPY (excluding tax); software development environment (SDK) included


【Rental price】

150,000 JPY (excluding tax) / month * Rental periods start from one month.

Product catalogue

RoboCar 1/10X support

This page provides RoboCar 1/10X manuals, software updates, and options.
* An ID and password are required to use this service.

Product Inquiry

Inquiries before purchase, such as requests for a product demonstration, are welcome.

Related products

POWER WHEEL II
Mobile trolley research and development platform

RoboCar® SUV
A vehicle for autonomous driving development based on a commercially available SUV
RoboCar® MiniVan
A vehicle for autonomous driving development based on a commercially available minivan
Kilobot
A research platform for understanding swarm control and exploring its wide range of possibilities