Channel: UGV News | Unmanned Ground Vehicles, Military Robots | Robotics News

Neurotechnology Announces SentiBotics Mobile Robotics Development Kit 2.0


Neurotechnology, a developer of robotics and high-precision object recognition and biometric identification technologies, has announced the release of the SentiBotics Development Kit 2.0. SentiBotics is designed to help robotics developers and researchers reduce the time and effort required to develop mobile robots by providing the basic robot infrastructure, hardware, component tuning and robotic software functionality.

The kit includes a tracked reference mobile robotic platform, a 3D vision system, a modular robotic arm and Robot Operating System (ROS) framework-based software with many proprietary robotics algorithms fully implemented. Full source code, detailed descriptions of the robotics algorithms, hardware documentation and programming samples are also included.

“This new version of the SentiBotics robotics kit contains not only substantial improvements to existing features but additional functionality as well,” said Dr. Povilas Daniusis, Neurotechnology robotics team lead. “These new capabilities not only can save time and effort for developers of mobile manipulation control systems, they also enable SentiBotics to serve as an educational platform for use at universities.”

The new SentiBotics Development Kit 2.0 includes motion planning software and accurate 3D models of the robot, enabling the robot to grasp and manipulate objects while avoiding obstacles. The 3D object recognition and object grasping system also allows the robot to grasp arbitrarily oriented objects. In addition, Neurotechnology has added the ability to use a simulation engine that enables robotics developers to work in virtual environments.

SentiBotics software includes source code of bio-inspired simultaneous localization and mapping (SLAM), autonomous navigation, 3D object recognition and object grasping systems that are tuned to work with the SentiBotics hardware platform.

New features and upgraded components include:

  • Object delivery – The robot navigates through its previously-mapped locations until it reaches a location where an assigned object was previously recognized. The robot tries to directly recognize the assigned object and will reposition itself until recognition occurs and grasping is possible. The object is then grasped using the robotic arm, placed into the attached box and delivered to a place where the delivery command was given.
  • Object grasping in occluded scenes – The SentiBotics robot can perform path planning for its manipulator, avoiding obstacles that might be between the recognized object and the manipulator itself. If necessary, the robot can automatically reposition itself in order to perform the grasping task. For example, the robot can drive closer or reorient its angle to the object such that it is in the optimal position for picking it up. The SentiBotics robot can automatically determine an object’s orientation and arrange its manipulator in a way best suited for grasping a particular object according to that object’s position in space.
  • Support for simulation engine – Enables the development and testing of robotics algorithms in simulated environments, which can reduce development time.
  • 3D models of the robot – SentiBotics includes 3D models of the mobile platform and robotic arm which are useful for path planning, visualization and simulation.
  • Higher level behavior module – Enables easily programmable, higher-level behavior such as the aforementioned object delivery task, which includes autonomous navigation, object recognition and object grasping.
  • Additional upgrades – Includes more accurate SLAM, 3D object recognition system, improved mobile platform controllers and calibration algorithms.
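
The object-delivery behavior described above chains navigation, recognition, repositioning and grasping into a single higher-level task. A minimal sketch of that flow as a finite-state machine follows; the state names and the robot interface are hypothetical illustrations, not the actual SentiBotics API:

```python
# Hypothetical sketch of the delivery behavior as a finite-state machine.
# The robot interface used here is illustrative, not the SentiBotics API.

SEARCH, APPROACH, GRASP, DELIVER, DONE = range(5)

class DeliveryBehavior:
    def __init__(self, robot):
        self.robot = robot
        self.state = SEARCH

    def step(self):
        if self.state == SEARCH:
            # Revisit previously-mapped locations until the target is recognized.
            if self.robot.object_recognized():
                self.state = APPROACH
            else:
                self.robot.goto_next_mapped_location()
        elif self.state == APPROACH:
            # Reposition until grasping is geometrically possible.
            if self.robot.object_graspable():
                self.state = GRASP
            else:
                self.robot.reposition_toward_object()
        elif self.state == GRASP:
            self.robot.grasp_object()   # pick with the arm
            self.robot.place_in_box()   # stow in the attached box
            self.state = DELIVER
        elif self.state == DELIVER:
            # Return to where the delivery command was issued.
            self.robot.goto_command_origin()
            self.state = DONE
        return self.state
```

Each call to `step()` advances the behavior one decision; a real system would run such a loop continuously, re-checking recognition and reachability as the robot moves.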

SentiBotics robot hardware includes the following components:

  • Tracked mobile platform – Includes motor encoders and an inertial measurement unit (IMU); capable of carrying a payload of up to 10 kg.
  • Modular robotic arm with seven degrees of freedom – Based on Dynamixel servo motors, capable of lifting objects up to 0.5 kg. Each motor provides feedback on position, speed and force.
  • 3D vision system – Allows the robot to measure distances in a range of 0.15 to 3.5 meters.
  • Powerful onboard computer – Intel NUC i5 computer with 8 GB of RAM, 64 GB SSD drive and 802.11n wireless network interface; comes with pre-installed SentiBotics software.
  • Durable 20 Ah LiFePO4 battery with charger.
  • Control pad.

All platform components can be easily obtained from manufacturers and suppliers worldwide, so robotics developers and researchers in private industry, universities and other academic institutions can use SentiBotics as reference hardware to build their own units or to incorporate different platforms and materials.

The SentiBotics Development Kit also includes:

  • Details of all algorithms used, including descriptions and code documentation.
  • ROS-based infrastructure – Allows users to rapidly integrate third-party robotics algorithms, migrate to other hardware (or modify existing hardware) and provides a unified framework for robotic algorithm development. SentiBotics 2.0 is based on the ROS-Indigo version.
  • Step-by-step tutorial – Describes how to set up the robot, connect to it and test its capabilities.
  • Hardware documentation and schematic.
  • Demonstration videos and code samples (C++ and Python) – Can be used for testing or demonstration of the robot’s capabilities, including how to:
    - Drive the robot platform and control the robotic arm with the control pad.
    - Build a map of the environment by simply driving the robot around, and use this map for autonomous robot navigation.
    - Calibrate the robot.
    - Teach the robot to recognize objects.
    - Grasp a recognized object with the robotic arm, including cases where the grasping scene contains obstacles.
    - Deliver an object that is located in a previously-visited place.
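
The map-building sample above (driving the robot around, then using the map for autonomous navigation) reduces, at its core, to accumulating range readings into a world-fixed grid. The sketch below shows that core update under strong simplifying assumptions (a known pose and no scan registration); it is illustrative only and not SentiBotics code:

```python
import math

def update_occupancy_grid(grid, pose, scan, resolution=0.1):
    """Mark the cells hit by range readings as occupied.

    grid: dict mapping (ix, iy) cell indices to hit counts
    pose: (x, y, heading) of the robot in meters / radians (assumed known)
    scan: list of (bearing, distance) readings relative to the robot
    """
    x, y, heading = pose
    for bearing, distance in scan:
        # Project the reading's endpoint into world coordinates.
        hx = x + distance * math.cos(heading + bearing)
        hy = y + distance * math.sin(heading + bearing)
        cell = (int(round(hx / resolution)), int(round(hy / resolution)))
        grid[cell] = grid.get(cell, 0) + 1
    return grid
```

A real SLAM system additionally estimates the pose itself and clears the free space along each ray; this sketch only shows why a usable map falls out of simply driving around while scanning.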

The post Neurotechnology Announces SentiBotics Mobile Robotics Development Kit 2.0 appeared first on Unmanned Systems Technology.


XCMG Launches Fully Remote-Controlled Intelligent Excavator


XCMG, a construction equipment company, has developed its first fully remote-controlled excavator, the XE15R. Without a cab, the XE15R is also the company’s smallest excavator, at 1.35 m in height and 1.08 m in width.

The newly developed XE15R features a wireless control function with a 100-meter range, and integrates mechanical, electronic and hydraulic control technology with a CAN bus interface design. Unmanned operation reduces labor intensity, which is particularly useful in severe operating environments such as toxic conditions or extreme temperatures.

As an intelligent excavator, XE15R also features a self-learning function. It can save operating maneuvers and then replay them automatically on request, a breakthrough that further ensures security, agility and reliability in operation.
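
The save-and-replay capability can be pictured as a timestamped command tape: the controller records the operator’s maneuvers, then feeds them back to the actuators in order on request. The class below is a hypothetical illustration of that idea, not XCMG’s implementation:

```python
class ManeuverRecorder:
    """Record operator commands with timestamps, then replay them in order."""

    def __init__(self):
        self.tape = []  # list of (timestamp, command) pairs

    def record(self, timestamp, command):
        self.tape.append((timestamp, command))

    def replay(self, actuate):
        # Feed the saved commands back, ordered by timestamp.
        for _, command in sorted(self.tape):
            actuate(command)
```

A production system would also reproduce the original timing and guard every replayed command with safety interlocks.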

“Intelligentization is an inevitable choice as XCMG follows the path of new industrialization, and XE15R shows that we do what we say. Its design embodies our pursuit of efficiency and user friendliness.” said Wang Min, president of XCMG.

Based on its global collaborative development platform, XCMG has devoted significant effort towards the research and development of core hydraulic parts and intelligent technology, and has successfully applied the results to its excavators and other product series.

 


Northrop Grumman Displays Andros FX Unmanned Ground Vehicle at DSEI


Northrop Grumman Corporation has featured its recently-launched Andros FX unmanned ground vehicle at the DSEI exhibition in London. The firm’s subsidiary Remotec Inc. designed Andros FX to defeat a wide range of threats including vehicle-borne improvised explosive devices.

The most visible features of the Andros FX are the four track pods that replace the traditional Andros articulators, and a new arm design that provides more lift capacity and greater dexterity by adding roll joints that provide nine degrees of freedom. It also features updated system electronics, mobility improvements for increased speed and maneuverability, and a new touchscreen operator control unit with 3-D system graphics, advanced manipulator controls and improved user interface.

“The feedback we have received from EOD teams has indicated what they need most are more capabilities to counter vehicle-borne IEDs,” said Walt Werner, director, Northrop Grumman Remotec. “Andros FX has been designed from the ground up, to meet these requirements and provides the most advanced technology while at the same time making the system easier to use and maintain, and keeping danger at a distance.”

The Andros FX builds on Remotec’s F6 family and its 20-year record as a workhorse for first responders and the military. Like other robots in Remotec’s Andros fleet, its operating system provides much greater information to the operator while easing user workload through more interactivity with intelligent payloads such as chemical and radiation sensors. Preset arm positions and the ability to “fly the gripper” make manipulation of objects much easier, faster and more accurate.

Andros FX was designed using a proven concurrent engineering process, using cross-functional involvement early in the design phase thus resulting in a much lower lifecycle cost, as well as a product that can be quickly adapted for a variety of missions, easily upgraded and expanded, and more efficiently maintained.


Cobham Unmanned Systems Introduces New Explosive Ordnance Robot


Cobham Unmanned Systems, a provider of unmanned security solutions, has introduced a new remotely operated explosive ordnance vehicle to the unmanned systems industry during the 2015 DSEI conference in London.

The telemax 4×4, an optimised derivative of Cobham Unmanned Systems’ suite of telemax robots, incorporates a new four-wheel drive system (top speed of 11.5 km/h) and WLAN-based data and video transmission technology to achieve improved effectiveness, connectivity and functionality.

Incorporating more than two decades of Cobham Unmanned Systems experience in the development and production of explosive ordnance robots, the 4×4 offers the same diverse range of tools, accessories and sensor options as all members of the telemax family, including critical capabilities such as Tool Centre Point (TCP) control and automatic tool change. The use of the mission-proven manipulator arm and other telemax specifications allows for the option to upgrade the 4×4 to the telemax PRO or CBRN at a later date.


“Cobham’s investment in the development of the telemax 4×4 highlights our commitment to improve our offering to the unmanned systems market through the diversification of our telemax family. The telemax 4×4 offers increased value to the customer, whilst still delivering the high performance and functionality expected of a telemax robot in critical missions,” said Thomas Biehne, Director, Business Development and Sales.



ASI Expands HQ Unmanned Vehicle Test Track


Autonomous Solutions, Inc. (ASI) has announced the completion of a facilities expansion involving its unmanned vehicle test track.

This expansion includes a tripling of the test tracks and the creation of more realistic urban scenarios with the needed signage and road striping to develop and test new driverless vehicle algorithms.

“This new facility expansion is critical in the support of our new partners. Our proving ground is a test city, it’s a test mine, it’s a test farm field. We have built into our facility all of these different environments to enable rapid development and rigorous 24/7 product verification and validation testing.” says Mel Torrie, ASI co-founder and CEO.

“This facility expansion not only allows us to do more testing, but also provides more office space for the quickly growing engineering team,” says Mr. Torrie. ASI is currently automating testing for three of the five big automotive companies in Detroit and is expanding into Europe and China with partnerships starting in January. Other markets such as mining, agriculture and construction leverage the same underlying robotics platform but use different “apps” customized to the unique needs of each.


Clearpath Robotics Announces New Self-Driving Warehouse Robot


Clearpath Robotics, a developer of field and service robotics, has announced the release of OTTO, the company’s first self-driving warehouse robot, at RoboBusiness 2015. OTTO is designed for intelligent heavy-load transport in industrial environments, delivering improved throughput and decreased operating costs.

Modern factories and warehouses need to be reconfigurable, responsive, and efficient to survive. Designed to address these conditions, OTTO uses the same underlying self-driving technology popularized by the Google self-driving car. The system delivers dynamic and efficient transport in increasingly congested industrial operations. Traditional material handling systems require costly and rigid changes to infrastructure, cannot adapt to a changing environment, and are not safe for collaboration with warehouse personnel. OTTO does not rely on external infrastructure for navigation, making implementation hassle-free and highly scalable. It can transport 3300 lb loads at speeds up to 4.5 mph, while tracking along optimal paths and safely avoiding collisions.
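
Infrastructure-free navigation of this kind typically plans optimal paths over an internal map rather than following fixed guide wires. As a hedged illustration (not Clearpath’s actual planner), a minimal Dijkstra search over a 4-connected occupancy grid looks like this:

```python
from heapq import heappush, heappop

def shortest_path(grid, start, goal):
    """Dijkstra over a 4-connected grid; grid[r][c] == 1 marks an obstacle.

    Returns the list of cells from start to goal, or None if unreachable.
    """
    rows, cols = len(grid), len(grid[0])
    frontier = [(0, start, [start])]  # (cost so far, cell, path taken)
    visited = set()
    while frontier:
        cost, cell, path = heappop(frontier)
        if cell == goal:
            return path
        if cell in visited:
            continue
        visited.add(cell)
        r, c = cell
        for nr, nc in ((r + 1, c), (r - 1, c), (r, c + 1), (r, c - 1)):
            if 0 <= nr < rows and 0 <= nc < cols and not grid[nr][nc]:
                heappush(frontier, (cost + 1, (nr, nc), path + [(nr, nc)]))
    return None
```

Real planners work in continuous space with vehicle footprints and moving obstacles, but the principle is the same: recompute the best path whenever the map changes, with no external infrastructure required.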

“North American manufacturers are constantly under pressure to find new ways to gain an edge against low-cost offshore competition. Traditional automation is saturating. But what about the more complex tasks too difficult or expensive to automate?” said Matt Rendall, CEO and Co-Founder of Clearpath Robotics. “We created OTTO to reinvent material transport and give North American manufacturers a new edge.”

Applications for OTTO include moving pallets in a warehouse or cross-dock, and kitting or assembly line delivery. OTTO units are currently deployed in five test facilities, the first of which belongs to General Electric.

GE has collaborated with Clearpath on service robot development since 2013 and recently became one of Clearpath’s first OTTO customers. Clearpath has also announced that GE Ventures has become a strategic investor in the company for an undisclosed sum.

“We believe robotics will drastically improve the industries that GE serves,” said Ralph Taylor-Smith, Managing Director of GE Ventures. “We look forward to further partnering with Clearpath and exploring the role large-scale service robots may play for us and for our customers in the future. This Clearpath investment from GE reflects a deepening of the industrial partnership in advanced manufacturing and field service operations with self-driving vehicles and service robots.”

“GE is one of the world’s most powerful and innovative brands,” said Rendall. “We are honored to partner with GE and we look forward to shaping the industry with them.”


Adept Technology Lynx Wins Robotics Business Review Game Changer Award


Adept Technology, Inc., a developer of autonomous mobile robot solutions, has announced that the company’s Lynx autonomous intelligent vehicle with Acuity navigation has won a Robotics Business Review Game Changer Award. The award was announced during the RoboBusiness 2015 trade show in San Jose, California.

“We are proud that our Lynx mobile robot with Acuity navigation won a Game Changer Award,” said Terry Hannon, Adept’s chief business development and strategy officer. “Lynx robots are providing users around the world with rapid, dependable goods delivery inside their warehouses, factories, and other facilities, improving their operations’ efficiency and safety and lowering costs. Acuity allows the robots to navigate complex facilities even when surroundings change significantly, enabling system deployment into the most dynamic settings.”

Standard Lynx mobile robots can operate in dynamic environments where the surrounding features change up to 80 percent. Acuity is a system option that further enhances a robot’s navigational capability by using overhead static lighting to pinpoint the robot’s location, allowing operation where the floor-level environment is even more dynamic. With Acuity, Lynx robots can perform applications such as work-in-progress transport in busy manufacturing settings and materials handling in dynamic warehouses, and can operate effectively in facilities with wide open spaces.

Adept’s Lynx AIVs provide autonomous materials transport in industries ranging from manufacturing and warehousing to healthcare and semiconductor. Unlike traditional AGVs, Lynx systems require no facility modifications to function. The robots intelligently self-navigate, avoiding obstacles, and they automatically select the optimal path to complete a task. Lynx robots work collaboratively with human counterparts. They are quick to deploy, work as single units or in fleets, and can run in conjunction with the user’s existing enterprise management system.

Tom Green, editor in chief of Robotics Business Review, commented: “Each of these fascinating machines had a backstory about it that told much about the people who created them. Most were the product of years of labor and problem solving; some had taken as much as decades to get to our office door. In the process, robotics was well served and expanded upon with the creation of something practical and worthy.”


RE2 Robotics and University of Texas Arlington to Develop Robotic Nursing Assistant


RE2, Inc., a developer of robotic manipulator arms, has announced that the company is collaborating with the University of Texas Arlington (UTA) to design an Adaptive Robotic Nurse Assistant for the National Science Foundation (NSF).

With nearly 3 million registered nurses employed in the United States, RNs make up the largest pool of healthcare providers in the country. The goal of this robotic nursing assistant project is to provide RNs with assistive robots to support their activities within a hospital setting.

Dan Popa, an associate professor of electrical engineering at UTA, is leading the National Science Foundation Partnerships for Innovation: Building Innovation Capacity grant titled “Adaptive Robotic Nursing Assistants for Physical Tasks in Hospital Environments.”

“We envision our robotic nursing assistant will perform the more routine duties that must be done by nurses daily, such as sitting with a patient that is trying to get out of bed and walking with a patient,” stated Popa. “RE2’s mobile manipulation and robotic nursing design experience make the company an ideal partner for this program.”

RE2’s robotic manipulator arms will serve as the brawn for the robotic nursing assistant to aid patients and reduce on-the-job injuries suffered by nurses during lifting and maneuvering patients.

“We are honored to help design the next-generation robot to assist nurses with their daily tasks,” said Jorgen Pedersen, president and CEO of RE2. “We believe that there is a market for this technology within healthcare environments because of the productivity boost it will offer RNs – allowing them to focus on critical responsibilities while the robot performs the time-consuming mundane tasks.”



ASI Partners With Ford to Develop Robotic Vehicle Testing


Autonomous Solutions Inc. (ASI) has partnered with Ford to further develop ASI’s software and hardware components that enable autonomous, robotic operation of test vehicles. The industry-first technology saves time and spares human drivers from such physically demanding tasks as driving over curbs and through potholes in durability testing.

A Ford truck being run through rigorous durability tests using ASI’s robotic vehicle automation kit. Image: ASI.

“We’re proud to work with Ford to help further develop this technology and to be granted a license from Ford Global Technologies for their patented bell crank components,” said Mel Torrie, CEO, ASI. “The enhancements we’ve made with Ford will improve the durability, reliability and performance of these systems – allowing for even more accurate testing and higher quality vehicles.”

Robotic durability testing includes command and control software as well as a robotics platform installed in the test vehicle that controls vehicle steering, acceleration, braking, transmission, and more. Ford-developed bell crank actuators control the throttle and brake pedals with a metal rod. The module is set to follow a preprogrammed course, and the vehicle’s position is tracked by cameras in a central control room and via GPS accurate to plus/minus one inch.
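
A preprogrammed course of this kind can be followed with a simple waypoint loop: step toward the current waypoint and advance to the next one once the tracked position is within tolerance. The function below is a hedged, point-mass sketch of that idea (the tolerance loosely mirrors the plus/minus one inch GPS accuracy mentioned above), not ASI’s control software:

```python
import math

def follow_course(start, waypoints, step=0.5, tolerance=0.03):
    """Drive a point vehicle through a preprogrammed course.

    Each tick moves up to `step` meters toward the current waypoint and
    advances to the next waypoint once within `tolerance` meters (~1 inch).
    Returns the trajectory of positions visited.
    """
    x, y = start
    trajectory = [(x, y)]
    for wx, wy in waypoints:
        while math.hypot(wx - x, wy - y) > tolerance:
            d = math.hypot(wx - x, wy - y)
            s = min(step, d)  # do not overshoot the waypoint
            x += s * (wx - x) / d
            y += s * (wy - y) / d
            trajectory.append((x, y))
    return trajectory
```

A durability-test rig would additionally close the loop on steering, throttle and brake actuators rather than teleporting the vehicle, but the repeatability comes from exactly this kind of tight position-feedback loop.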

Through ASI’s Mobius command and control software, operators have the ability to create paths and events for vehicles. An operator can establish repeatable paths that return more precise data results than previously possible due to the robotic platform’s precision.

Robotically driven vehicles are expected to repeatedly perform tests on torturous surfaces at Ford’s proving ground on tracks with names like Silver Creek, Power Hop Hill and Curb Your Enthusiasm. These tests can compress 10 years of daily driving abuse into courses just a few hundred yards long, with surfaces that include broken concrete, cobblestones, metal grates, rough gravel, mud pits and oversized speed bumps.


The innovative technology is helping to ensure the all-new 2017 F-Series Super Duty – Ford’s toughest, smartest, most capable Super Duty ever – is Built Ford Tough. Super Duty has undergone the equivalent of years of abuse and durability testing in a short amount of time to ensure it will hold up to a lifetime of the hard work its owners expect.

Ford has granted a patent license to ASI, providing the company rights to incorporate and use its bell crank components in the systems ASI sells to other automakers and suppliers to test cars, trucks, buses and military vehicles.

“This robotic testing kit is available to purchase directly from ASI immediately,” said Chris Danowski, director of technology commercialization and intellectual property licensing, Ford Global Technologies. “Several automotive OEMs have already placed orders to purchase systems for their own testing.”

In use since 2013, Ford’s latest generation of bell cranks has seen significant improvements in reaction time and accuracy of the throttle and brake. Patented new design changes simplify installation, resulting in reduced installation time. Other changes enable fewer modifications to test vehicles and improved system performance with better component response. The system can quickly be deactivated, allowing a test engineer to gain control of the vehicle from the driver’s seat.


Autonomous Vehicle with Velodyne LiDAR Wins ARGOS Robotics Challenge


ESIGELEC, an engineering school based in Rouen, France, has announced that its Viking Robot – equipped with Velodyne’s VLP-16 3D real-time LiDAR Puck – has taken first place in the ARGOS Challenge.

The ARGOS Challenge is an international robotics competition designed to foster development of a new class of autonomous robots adapted to surveillance and technical maintenance of oil and gas sites.

ESIGELEC’s Team Vikings developed a six-degrees-of-freedom (6-DOF) Monte Carlo Localization (MCL) system based on VLP-16 measurements, with an absolute position error of less than 3 cm and an orientation error of less than 0.3° at 20 Hz. Teams were evaluated on how well each addressed safety requirements, mobility and navigation, data collection, and human machine interface (HMI).
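
Monte Carlo Localization maintains a cloud of pose hypotheses (particles) that is moved with the motion model, weighted by how well each particle explains the sensor data, and resampled. A deliberately simplified one-dimensional sketch of one such cycle follows; the Vikings’ 6-DOF, VLP-16-based implementation is of course far more elaborate:

```python
import random

def mcl_step(particles, motion, measurement, measure_fn, noise=0.05):
    """One predict-weight-resample cycle of Monte Carlo Localization.

    particles: list of state hypotheses (here, 1-D positions)
    motion: commanded displacement since the last cycle
    measurement: the observed sensor value
    measure_fn: maps a state to the sensor value it would predict
    """
    # Predict: apply the motion model with some process noise.
    moved = [p + motion + random.gauss(0.0, noise) for p in particles]
    # Weight: particles whose prediction matches the observation score higher.
    weights = [1.0 / (1e-6 + abs(measure_fn(p) - measurement)) for p in moved]
    # Resample: draw a new particle set in proportion to the weights.
    return random.choices(moved, weights=weights, k=len(particles))
```

The particle mean (or mode) serves as the pose estimate; in a distinctive environment the cloud collapses tightly around the true state after a few cycles.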

In the first round, the robot had to be able to achieve automatic inspection missions under real conditions. Teams received the script two hours prior to the competition: the robot then needed to pass through control points to read gauges monitoring the state of pressure sensors and to determine whether the floodgates were open or closed. The robot needed to traverse a 20-centimeter-high obstacle in order to automatically climb on or off the platform, and encountered various unexpected obstacles as well. The ESIGELEC Vikings robot was able to perform the first mission on automatic mode in less than four minutes, far quicker than the 20 minutes allotted to the task.

“Clearly, our localization method based on VLP-16 measurements has been a key factor in our success,” said Xavier Savatier, Ph.D., Head of the Instrumentation, IT & Systems Department, ESIGELEC. “What stands out about the VLP-16 are its light weight and size, and its 360° field of view, which helped its integration on the mobile Vikings robot. The sensor’s 3D laser scan strongly reduces the perceptual aliasing problem, which is an issue since there’s so much symmetry in the outdoor site.”

“The ARGOS Challenge dramatically and effectively demonstrates what robots equipped with real time LiDAR sensors can achieve in the demanding oil and gas environment,” said Wolfgang Juchmann, Director of North America Sales and Product Management, Velodyne. “Over the duration of the challenge, robots will attempt vital tasks in simulated offshore or onshore platforms, with total autonomy and under the most extreme conditions. Those are precisely the conditions for which we developed our multi-channel 3D LiDAR sensors.”

In time, robots will be capable of performing inspection tasks, detecting anomalies and intervening in emergency situations. All of the autonomous missions require precise localization, an especially difficult task to perform in an industrial outdoor setting, with pipes in all directions and no walls to define the space.

The ARGOS Challenge research and development process runs for three years and includes three rounds of competition at the test site in Lacq, Midi-Pyrénées, in southern France. The difficulty level will increase during the second round competition in March. At that time, robots will be expected to identify unusual sounds such as gas leaks or cavitation bubbles inside the pipes, and check for unknown obstacles or fire extinguishers. Tests will take place on the upper levels of the structure, requiring robots to ascend and descend flights of stairs.


NASA Awards Humanoid Robots to University R&D Groups

NASA R5 Humanoid Robot

NASA has announced that it has awarded prototypes of its R5 humanoid robot to two universities for advanced research and development work. Through these partnerships, NASA hopes to develop humanoid robots that can help or even take the place of astronauts working in extreme space environments.

Robots like NASA’s R5 could be used in future NASA missions either as precursor robots performing mission tasks before humans arrive or as human-assistive robots actively collaborating with the human crew. R5 was initially designed to complete disaster-relief maneuvers; however, its main goal is to prove itself worthy of even trickier terrain: deep space exploration.

“Advances in robotics, including human-robotic collaboration, are critical to developing the capabilities required for our journey to Mars,” said Steve Jurczyk, associate administrator for the Space Technology Mission Directorate (STMD) at NASA Headquarters in Washington. “We are excited to engage these university research groups to help NASA with this next big step in robotics technology development.”

The two university proposals selected are:

  • Robust Autonomy for Extreme Space Environments: Hosting R5 at Massachusetts Institute of Technology in Cambridge, Massachusetts, led by principal investigator Russ Tedrake
  • Accessible Testing on Humanoid-Robot-R5 and Evaluation of NASA Administered (ATHENA) Space Robotics Challenge – Northeastern University in Boston, Massachusetts, led by principal investigator Taskin Padir

The two university groups were chosen through a competitive selection process from groups entered in the Defense Advanced Research Projects Agency (DARPA) Robotics Challenge. They will also receive up to $250,000 a year for two years and have access to onsite and virtual technical support from NASA. STMD’s Game Changing Development Program, which is charged with rapidly maturing innovative technologies that will one day change the way NASA explores space, is funding the research.

The university principal investigators will serve as critical partners in NASA’s upcoming Space Robotics Challenge where the two R5 units will act as instruments. The challenge is part of the agency’s Centennial Challenges Program, and is divided into two competitions: a virtual competition using robotic simulations, and a physical competition using the two upgraded R5 robots. The goal of the challenge is to create better software for dexterous humanoid robots used in space missions, giving them more autonomy.

NASA’s Langley Research Center in Hampton, Virginia, manages the Game Changing Development Program for NASA’s Space Technology Mission Directorate. The Space Technology Mission Directorate is responsible for developing the cross-cutting, pioneering, new technologies and capabilities needed by the agency to achieve its current and future missions.


Velodyne 3D LiDAR Sensors Used to Win Intelligent Vehicle Future Challenge


Velodyne LiDAR, a developer of real-time LiDAR sensor technology, has announced that its sensors were used by 17 of the 20 competitors taking part in the Intelligent Vehicle Future Challenge (IVFC) in China’s Changshu city. The purpose of the event was to showcase autonomous vehicles in action, and the top five finishers all used Velodyne’s HDL-64 sensor.

Inspired by the DARPA Challenge, the IVFC is now in its seventh year, combining a conference dedicated to self-driving vehicles and a competition along a varied 13-kilometer course. The IVFC, underwritten by the National Natural Science Foundation of China, aims to advance perceptions of the natural environment and decision-making for unmanned vehicle platforms. Wei Weng, Velodyne Director of Asia Sales, addressed that theme in his presentation to the future roboticists in attendance.

Among the field of 20 teams, drawn from universities and research entities with some corporate sponsorship, the Military Transportation University contingent again took first place. Hailing from Tianjin, China, the MTU team won for the second consecutive year using Velodyne’s HDL-64 as its core sensing technology. For its effort, the MTU team received a Velodyne VLP-16 LiDAR Puck. Although the HDL-64 sensor was the winning technology, Velodyne is working to provide similar functionality in a form factor and at a price suited to the automotive industry. The company is scheduled to deliver the new sensor to automotive OEMs in the first quarter of 2016.

Changshu city, Jiangsu Province, is home to an economic development zone created specifically to promote the automotive industry. The IVFC course included urban, off-road and highway settings, with teams offered the option of skipping the off-road setting. Vehicles were judged on “4S” – safety, speed, “smartness” and smoothness. The 2015 contest added a new challenge: passing other vehicles, mirroring interest in automated driving lane change features beginning to take hold in the automotive industry.

“For off-road settings, Velodyne LiDAR was the critical sensing technology to simultaneously localize, map and plan a path through the unstructured environment,” Weng said. “This grueling challenge put each LiDAR sensor in our product line – the 16-channel VLP-16 LiDAR Puck and the 32-channel HDL-32E, as well as the HDL-64 – through a series of rigorous real-world tests. We’re delighted that the HDL-64 was able to assist the top five finishers in such a substantial way, and that Velodyne was able to contribute to the education and outcomes of a dozen other teams.”

TORC Robotics Announces Remote Control System for Caterpillar Loaders

TORC Robotics, a provider of unmanned and autonomous ground vehicle solutions, has announced the availability of its RemoteTask remote control system for Caterpillar’s Cat D-Series Skid Steer, Multi-Terrain and Compact Track Loaders. The RemoteTask system enables operators to precisely control the machines from outside the cab, as far away as 1,000 ft (300 m) and at a safe distance from potentially hazardous tasks and environments.

“Providing customers a solution to remove the operator from harm’s way while operating Cat compact loaders in certain applications further supports Caterpillar’s commitment to safety,” stated Jeff Griffith, Sr. Market Professional for Caterpillar. “TORC Robotics and Caterpillar have teamed up on the development of this solution for SSLs, MTLs, and CTLs.”

The RemoteTask controls feature virtually no lag in machine response time. With RemoteTask, the remote operator’s performance is as fast and smooth as in-cab operation. The intuitive remote control interface is designed to closely mimic in-cab machine controls, creating an easy operator transition from manual to remote operation.

“When using the RemoteTask controller, the machine response is instantaneous,” said Bob Shoop, Product Demonstrator/Instructor for Caterpillar. “The feel and response mirrors the operator controls of the machine itself. Remote controls I’ve tested previously often had a delayed response.”

Installing the RemoteTask system on a machine takes about an hour. With the system installed, the machine can transition from manual to remote mode at the turn of a key switch. The system is completely transferable between units.

The portable yet durable console allows the operator to move as needed for good lines of sight. RemoteTask can be integrated into 16 models of Cat D Series Skid Steer, Multi-Terrain and Compact Track Loaders and can control more than 200 work tools with all hydraulic functions controlled via the remote system.

US Army Tests Battlefield Unmanned Ground Vehicles

The US Army has announced that soldiers from its 25th Infantry Division have tested small unmanned ground vehicles that could help them on the battlefield, with the aim of providing intelligence, surveillance and reconnaissance without putting the soldiers in harm’s way.

“This training event has been an exercise to address the basis of issue for Soldier multi-use equipment transport robots in the Pacific region,” said Joseph Alexander, Tank Automotive Research, Development and Engineering Center, or TARDEC, representative.

Working first-hand with the Tropic Lightning Soldiers is how Army research labs hope to extend the reach and capability of a platoon or company.

“Robotics have a very important place in the future of modern warfare. We want to keep the Soldiers out of harm’s way, especially when it comes to mundane activities, and a machine with intelligence or operated with a man in the loop may keep them safe,” Alexander said.

According to a U.S. Army Research, Development and Engineering Command, or RDECOM, article, U.S. Army science and technology advisors initiated this project to field robots capable of assessing chemical, biological, radiological, nuclear, explosives, or CBRNE, threats from a safe distance.

Additionally, the robots caught the eye of a group of combat medics with their ability to evacuate a casualty out of harm’s way during a simulation.

“When you’re carrying a casualty with a Skedco [plastic sled], two guys are out of the fight. Having a robot we will have an effective fire team in the fight,” said Sgt. Michael Murphy, 1st Battalion, 27 Infantry Regiment.

“This would be extremely helpful on the battlefield. The number one thing would be fire superiority and not sustaining any additional casualties,” Murphy added.

Freedom of maneuver is also a goal of this technology. The concept involves taking equipment out of a rucksack and mounting it on a robot that can move through a jungle environment.

“We have to give Soldier and Army leadership a level of comfort when using autonomous technology. This is the simple form of this technology. We are hoping to gradually increase that capability, as Soldiers become more comfortable when using robots,” said Drew Downing, RDECOM science advisor to U.S. Army Pacific.

Working hand-in-hand with soldiers and using tactics, techniques and procedures is the formula used to understand how soldiers will use the technology in the future.

“Soldiers are very creative or innovative, and they will find ways to use it, but we need to help them find that out in a controlled environment,” Downing said.

Velodyne to Supply LiDAR for Autonomous Transport Shuttles

Velodyne LiDAR, a developer of real-time LiDAR sensors, has announced that driverless technology specialist NAVYA will implement Velodyne’s real-time 16-channel 3D VLP-16 LiDAR Pucks as part of a two-year, two-vehicle test with Swiss public transport company PostBus Switzerland Ltd, in the first deployment of fully autonomous production vehicles.

The fully autonomous, driverless and electric ARMA shuttles will be subject to two-phase testing. The PostBus vehicles will initially be tested on a private, closed site through the spring of 2016. Once pilot-testing is authorized, the two shuttles will be able to run on public roads in the Swiss town of Sion, the capital of the Canton of Valais, and carry passengers in autonomous mode.

NAVYA’s vehicle has already travelled on the open road for the very first time during the Intelligent Transport Systems World Congress (ITS) in Bordeaux. The ARMA will transport up to fifteen people at a maximum of 25 km/h through the streets of the capital of Valais.

NAVYA ARMA steering systems make use of a multitude of technologies simultaneously: GPS RTK navigation devices, stereovision cameras, inertial navigation systems and odometry, in addition to LiDAR. Thanks to its sensors, NAVYA ARMA can position itself to within a few centimeters of its desired target and can identify all types of obstacles on the road – fixed, such as posts, and mobile, such as pedestrians – and signage, in both daylight and at night time.
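The obstacle detection the article describes relies on the LiDAR returns picking out objects that rise above the road surface. As a rough illustration of that idea only (this is not NAVYA's software; the function name, thresholds and vehicle-frame conventions below are all invented for the sketch), a minimal corridor check over a 3D point cloud might look like:

```python
import numpy as np

def detect_obstacles(points, ground_z=0.0, min_height=0.3,
                     corridor_half_width=1.5, max_range=20.0):
    """Flag LiDAR returns that rise above the ground plane inside the
    vehicle's forward corridor. `points` is an (N, 3) array of
    x (forward), y (left), z (up) coordinates in the vehicle frame."""
    ahead = (points[:, 0] > 0) & (points[:, 0] < max_range)
    in_corridor = np.abs(points[:, 1]) < corridor_half_width
    above_ground = points[:, 2] > ground_z + min_height
    return points[ahead & in_corridor & above_ground]

# Toy scan: a post 5 m ahead, a road-surface return, and a tree off to the side
scan = np.array([[5.0, 0.2, 1.0],   # post (obstacle in the corridor)
                 [8.0, 0.0, 0.05],  # road surface (below height threshold)
                 [6.0, 4.0, 1.2]])  # tree (outside the corridor)
obstacles = detect_obstacles(scan)
print(len(obstacles))  # 1 — only the post is flagged
```

A production system would of course work on dense scans, estimate the ground plane rather than assume it, and track moving objects over time; the sketch only shows the geometric filtering step.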

“This is a turning point for our company and for autonomous technology,” said Christophe Sapet, President at NAVYA in Lyon. “The system has been tested and now will be operational for the public. Velodyne LiDAR is key to the 3D vision system for the ARMA electric and autonomous shuttles.”

“The NAVYA ARMA shuttles, and their deployment by PostBus, represent a major milestone for autonomous transportation,” said Erich Smidt, European Sales Director Velodyne LiDAR. “What is most remarkable is how these various technologies have matured and converged in a solution that is safe, convenient and available to all. Velodyne LiDAR is the very essence of proven technology, having collectively logged millions of hours of use time in virtually every type of terrain and weather condition. A new era truly is beginning.”

ARMA shuttles will enable transport specialists to look into how they integrate into public areas and enable them to test innovative modes of transport. The aim is to offer an additional form of transportation on current routes to cover the needs of all users – in particular, those who live in areas not currently served by the public system.

CMU and Sikorsky Demonstrate Unmanned Collaborative Capabilities

Carnegie Mellon University (CMU) and Sikorsky, a developer and manufacturer of helicopters, have announced that they have recently participated in a joint autonomy demonstration that proved the capability of new, ground-air cooperative missions. The demonstration used a UH-60MU Black Hawk helicopter enabled with Sikorsky’s MATRIX Technology and CMU’s Land Tamer autonomous Unmanned Ground Vehicle (UGV).

Such missions could prevent warfighters’ exposure to hazardous conditions, such as chemically or radiologically contaminated areas.

“The teaming of unmanned aerial vehicles (UAVs) and unmanned ground vehicles, as demonstrated here, has enormous potential to bring the future ground commander an adaptable, modular, responsive and smart capability that can evolve as quickly as needed to meet a constantly changing threat,” said Paul Rogers, director, U.S. Army Tank Automotive Research, Development and Engineering Center (TARDEC). “The cooperative effort between the Army labs, academia and industry to bring solutions to the warfighter is exciting to see.”

The demonstration was performed for TARDEC through the Robotics Technology Consortium, which sponsored the Extending the Reach of the Warfighter through Robotics (ERWR) project.

The Black Hawk helicopter was provided by the U.S. Army Aviation and Missile Research Development and Engineering Center’s Aviation Applied Technology Directorate and was modified with the Sikorsky autonomy kit, MATRIX, to deliver the UAV capabilities the program required.

“The UH-60MU aircraft is a prototype of the UH-60 in a ‘fly-by-wire’ configuration,” said William D. Lewis, AMRDEC director of Aviation Development. “‘Fly-by-wire’ technology is the foundational enabler that facilitates autonomous aircraft operations.”

MATRIX Technology, launched in 2013, was designed to improve the capability, reliability and safety of flight for autonomous and optionally piloted vertical take-off and landing aircraft.

In the demonstration, the helicopter was operated in coordination with a UGV, developed by Carnegie Mellon’s National Robotics Engineering Center (NREC). The UGV Land Tamer all-terrain vehicle combined key elements of several NREC world-class autonomous systems to support missions in difficult environments.

“We were able to demonstrate a new technological capability that combines the strengths of air and ground vehicles,” said Jeremy Searock, NREC technical project manager. “The helicopter provides long-range capability and access to remote areas, while the ground vehicle has long endurance and high-precision sensing.”

During the demonstration mission, the unmanned helicopter picked up the UGV, flew a 12-mile route, delivered it to a ground location and released it. The drop-zone collaboration between the two autonomous systems demonstrated a uniquely differentiating capability.

Over the course of more than six miles, the UGV autonomously navigated the environment, while using its onboard chemical, biological, radiological and nuclear (CBRN) sensors to detect simulated hazards and delivered this information back to a remote ground station. The UGV was optionally teleoperated to explore hazard sites in greater detail, when necessary.

“We invested in Matrix Technology because we knew it would mean that, in certain scenarios, the warfighter can be kept out of harm’s way and would be able to perform more missions and perform them more effectively,” said Mark Miller, vice president of Research & Engineering at Sikorsky, a leader in helicopter design, manufacture and service. “This demonstration indicated just that.”

The exercise, at Sikorsky’s Development Flight Center, West Palm Beach, Fla., culminated a 19-month project between Carnegie Mellon’s NREC and Sikorsky to demonstrate for the Army autonomous delivery of a UGV by an Optionally Piloted or Unmanned Black Hawk helicopter, followed by a long-range autonomous ground mission to collect vital, on-the-ground intelligence. The collaboration between the UAV and the UGV demonstrated the effectiveness of unmanned systems in addressing logistics needs in unknown or dangerous environments.

Velodyne LiDAR Sensors to be Used in US Army Autonomous Shuttles

Velodyne LiDAR has announced that military contractor Robotic Research, LLC has successfully deployed Velodyne’s VLP-16 real-time 3D LiDAR sensor on an autonomous shuttle, as part of the Applied Robotics for Installations and Base Operations (ARIBO) program. The program seeks to improve services at posts like Fort Bragg, in North Carolina, meeting the critical needs of soldiers on the base.

Under a contract serving engineers from the U.S. Army’s Tank and Automotive Research, Development and Engineering Center, Robotic Research led a new pilot program featuring an autonomously-guided vehicle at Fort Bragg. The vehicle, limited to 15 miles per hour, is expected to be fully autonomous and designed to ferry soldiers from Fort Bragg’s Warrior Transition Unit barracks to Womack Army Medical Center nearly a half-mile away.

“As you would expect on a very large military base, parking is a problem,” said Edward Straub, program lead for ARIBO. “Half the time, according to user studies, soldiers have to park so far out in the parking lots that they walk almost as far as they would from the barracks but have spent a half-hour looking for a parking spot so end up late for their appointments.” Missed appointments can be a significant issue and added costs at medical centers on military bases, and for more reasons than just scheduling difficulties.

“We’re excited about this because our soldiers are able to avoid relying on their own transportation and parking,” said Dennis Small, Deputy Commander for Fort Bragg’s Warrior Transition Unit. “To be able to schedule an autonomous vehicle and tailor it to their schedule, avoiding a sometimes laborious walk that may often be in inclement weather like heat category five is important.”

The Hon. Katherine Hammack, Assistant Secretary of the Army, visited the Fort Bragg Warrior Transition Battalion, WTB, last December, to get an update on Womack Army Medical Center’s driverless vehicle testing. Robotic Research and Velodyne were honored to have such a distinguished visitor take part in the ride that wounded warriors will also be taking. ARIBO will transport wounded warriors from the WTB to Womack Army Medical Center, starting early this year.

In addition to cost savings, the pilot will provide TARDEC (Tank Automotive Research, Development and Engineering Center) engineers with valuable data to grow the program in the future.

“This pilot project will give us information we need to expand the use of automated vehicles to other areas and enable a long-term transportation strategy that involves automated, on-demand transportation,” Straub said. “The implications reach far beyond the immediate pilot group at Womack Army Medical Center and the Warrior Transition Battalion. I’m very excited about this program – it’s an opportunity to learn more about the technology and to determine what we’re capable of doing with the technology, finding the right applications where it can be used.”

In addition to ARIBO, the National Advanced Mobility Consortium (NAMC) and U.S. Army TARDEC recently tapped Robotic Research for its engineering expertise in autonomous software architecture. During the six-year contract, Robotic Research, which is based in Gaithersburg, Maryland, will integrate software on the Autonomous Mobility Applique System (AMAS) Autonomy Kit as part of TARDEC’s Autonomous Ground Resupply (AGR) program.

Under the AMAS/AGR program – and as a result of the ARIBO program success – Robotic Research will be evaluating Velodyne’s VLP-16, as part of the development of a fault-tolerant, vehicle-agnostic applique kit, to perform higher-level autonomous driving and planning functions. The Autonomy Kit will implement the interoperability profile (IOP) and contain robotic modes including, but not limited to, teleoperation, waypoint navigation and leader-follower. The system will be used on current and future autonomous vehicles and to address the needs of the Autonomous Convoy Operations (ACO) Program of Record (POR).
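Waypoint navigation, one of the robotic modes listed for the Autonomy Kit, can be sketched in a few lines: steer toward the current waypoint, advance to the next one once within a tolerance. This is a generic illustrative follower, not part of the AMAS Autonomy Kit; the function name, gains and limits are assumptions.

```python
import math

def waypoint_step(pose, waypoints, idx, tol=0.5, speed=1.0, dt=0.1):
    """One control step of a simple waypoint follower. `pose` is
    (x, y, heading); advances `idx` when the current waypoint is
    reached, otherwise turns toward it while moving forward."""
    x, y, theta = pose
    wx, wy = waypoints[idx]
    if math.hypot(wx - x, wy - y) < tol:
        idx = min(idx + 1, len(waypoints) - 1)
        wx, wy = waypoints[idx]
    heading_to_wp = math.atan2(wy - y, wx - x)
    # Heading error wrapped to [-pi, pi], then rate-limited
    err = (heading_to_wp - theta + math.pi) % (2 * math.pi) - math.pi
    theta += max(-0.2, min(0.2, err))
    x += speed * dt * math.cos(theta)
    y += speed * dt * math.sin(theta)
    return (x, y, theta), idx

pose, idx = (0.0, 0.0, 0.0), 0
route = [(2.0, 0.0), (2.0, 2.0)]
for _ in range(60):                 # simulate 6 seconds of motion
    pose, idx = waypoint_step(pose, route, idx)
print(idx)  # 1 — the follower has moved on to the second waypoint
```

A fielded kit would layer obstacle avoidance, vehicle dynamics and fault handling on top; the sketch only shows the core sequencing and steering logic.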

“We’re encouraged by the NAMC award to continue on our successful path in autonomous software architecture,” said Alberto Lacaze, President, Robotic Research. “This award positions us to prepare for the integration of our software architecture onto other large vehicle platforms. We’re thrilled to be a major part of this Program of Record, and to incorporate Velodyne’s market-leading VLP-16 LiDAR sensor as part of our solution.” Robotic Research’s autonomous software architecture will build off of lessons learned during the AMAS Joint Capability Technology Demonstration (JCTD) to develop the prototype Autonomy Kit.

“Velodyne LiDAR has become the de facto standard for autonomous vehicles, and we’re delighted to participate in programs with Robotic Research,” said David Oroshnik, Director of Technical Solutions, Velodyne LiDAR. “Alberto and his team have been tremendously supportive in adapting the VLP-16 for this application. The Army’s requirements are exacting, and we look forward to helping advance the state of the art for autonomous shuttles.”

Robotic Research is an engineering firm founded by Lacaze and Karl Murphy in 2002, during a resurgence of military interest in unmanned vehicles. Both worked together in the Intelligent Systems Division of the National Institute of Standards and Technology (NIST) in Gaithersburg, when they decided to pursue their interests in unmanned vehicles, artificial intelligence and various unique projects.

Robotic Research, LLC plays key roles in major Department of Defense unmanned ground system programs. In conjunction with General Dynamics Robotic Systems, Robotic Research team members designed and developed the autonomous mobility software for most autonomous ground robotic systems currently in use by the Army.

WPI Robotics Team to Train Autonomous Firefighting Robot

Worcester Polytechnic Institute (WPI) has announced that one of its professors has received nearly $600,000 from the Office of Naval Research (ONR) to develop motion planning algorithms for humanoid robots that are designed to fight fires aboard U.S. Navy ships.

With the award, Dmitry Berenson, PhD, assistant professor of computer science and robotics engineering at WPI, will develop software for a firefighting robot built by Virginia Tech engineering students. The project has also been supported by robotics teams from the University of Pennsylvania and Carnegie Mellon University. Dubbed SAFFiR (short for Shipboard Autonomous Firefighting Robot), the robot is designed to fight fires and perform other maintenance tasks. SAFFiR — which stands 5 feet 10 inches tall and weighs about 140 pounds — has already been put through an initial firefighting exercise on a ship and will undergo more specialized tests with WPI’s help.

“By using autonomous humanoids, we’re hoping to reduce the need for Navy personnel who have to perform a whole host of tasks and to also help mitigate the risks to people in fire suppression scenarios,” said Berenson.

Thomas McKenna, PhD, a program officer for ONR, said the program was developed to support Navy personnel. “Substantial losses occur when you have a major fire on a ship and can’t suppress it at an early stage,” said McKenna, who added that ships often carry ordnance and other flammable systems.

Naval officials also noted that it can be difficult to keep sailors up-to-date on resource training, but that a robot could be continually reprogrammed with the latest processes.

Berenson said he believes WPI was chosen for this project because of its leadership in motion planning research. “We could contribute our unique experience with motion planning for humanoid robots, which must perform in complicated scenarios,” he said. “Our focus on motion planning for autonomous robots, and not just those that are controlled by tele-operation, also helped us secure the grant.”

Berenson said humanoid robots are not particularly adept at moving in constrained quarters such as a Navy submarine, where they may have to navigate tight corners and stay upright despite rocking decks. “These robots are not good at locomotion in complex environments right now,” he noted.

With that in mind, WPI is testing its algorithms in a simulation of a complex, constrained environment using a virtual model of the SAFFiR robot. The team can plan a variety of movements for the robot and then see if it is able to walk correctly or if it falls down. “Of course,” he said, “just because it works in the simulator doesn’t mean it will work in the real world.”
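Sampling-based planners are a common way to exercise motion planning in simulation before moving to the real robot. As a toy stand-in for the kind of planner being tested (a minimal 2D RRT, not the SAFFiR planning stack; all names and parameters are illustrative, and it checks collisions only at new nodes, not along edges):

```python
import math
import random

def rrt(start, goal, is_free, step=0.5, iters=4000, goal_tol=0.5, seed=1):
    """Grow a rapidly-exploring random tree from `start` toward `goal`
    in a 10x10 2D world; `is_free(p)` reports whether a point is
    collision-free. Returns a path as a list of points, or None."""
    random.seed(seed)
    nodes = [start]
    parent = {0: None}
    for _ in range(iters):
        # Goal-biased sampling: head for the goal 10% of the time
        sample = goal if random.random() < 0.1 else (
            random.uniform(0, 10), random.uniform(0, 10))
        i = min(range(len(nodes)),
                key=lambda k: math.dist(nodes[k], sample))
        nx, ny = nodes[i]
        d = math.dist((nx, ny), sample)
        if d == 0:
            continue
        # Extend the nearest node one step toward the sample
        new = (nx + step * (sample[0] - nx) / d,
               ny + step * (sample[1] - ny) / d)
        if not is_free(new):
            continue
        nodes.append(new)
        parent[len(nodes) - 1] = i
        if math.dist(new, goal) < goal_tol:
            # Walk parent pointers back to the start
            path, k = [], len(nodes) - 1
            while k is not None:
                path.append(nodes[k])
                k = parent[k]
            return path[::-1]
    return None

# Obstacle: a wall at x == 5 with a gap around y == 5
free = lambda p: not (4.8 < p[0] < 5.2 and abs(p[1] - 5) > 1)
path = rrt((1.0, 1.0), (9.0, 9.0), free)
print(path is not None)  # True: a route through the gap was found
```

The simulator-versus-reality caveat Berenson raises applies directly: a planner like this can find a geometrically valid path that a real humanoid, with its balance and contact constraints, still cannot execute.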

The WPI team will test its algorithms in the actual robot in controlled conditions within a ship environment mock-up at Virginia Tech. The robot will walk up stairs and perform other locomotion tasks “to ensure the algorithms are generating the correct motion,” Berenson said.

In a later phase of the research, the robot’s capabilities will be tested again aboard the USS Shadwell, a decommissioned U.S. Navy landing ship docked in Mobile Bay, Ala.

In November 2014, Virginia Tech engineering students conducted a three-day demo aboard the Shadwell, culminating in an exercise that positioned the SAFFiR robot in front of a fire. At that time, engineering students said the tethered robot managed to locate and walk toward the fire, grab a fire hose, and blast the flames with water.

With the advanced motion capabilities provided by WPI’s algorithms, Virginia Tech’s robotics leaders hope the robot will be able to demonstrate even greater capabilities.

“The combination of a whole body controller developed by Virginia Tech and the WPI planning software will allow the robot to conduct more complex motions for navigating through the ship and effectively manipulating complex objects like firefighting hoses,” said Brian Lattimer, PhD, affiliate professor of mechanical engineering at Virginia Tech and vice president of research and development at Jensen Hughes, who advised students in the 2014 demo.

Airbus Group Innovations to Develop Humanoid Robots

Airbus Group Innovations (AGI), the global research and technology network of aeronautics and space technology developer Airbus Group, has announced the launch of a joint robotics research programme with Japan’s National Institute of Advanced Industrial Science and Technology (AIST) and France’s National Centre for Scientific Research (CNRS).

The programme will be dedicated to the research and development of humanoid robotic technology to perform complex manufacturing tasks in factories. The majority of research will be conducted at the CNRS-AIST Joint Robotics Laboratory (JRL), which was established in 2004 on the AIST campus in Tsukuba, Japan.

Satoshi Sekiguchi, Director General of the Department of Information Technology and Human Factors at AIST, inaugurated the new project together with Jean-Yves Marzin, Head of CNRS’s Institute for Engineering and Systems Sciences (INSIS), and Sébastien Remy, Head of Airbus Group Innovations. The signing of the partnership agreement was witnessed by France’s Ambassador to Japan, His Excellency Thierry Dana on the premises of the French Republic’s Embassy.

“The use of robotics has become ubiquitous in our industry,” notes Remy. “Both AIST and CNRS researchers are at the cutting edge of humanoid robotics research, and we are excited about the opportunity to meld our expertise with theirs on the further development of this key technology for manufacturing.”

Airbus Group and the CNRS-AIST JRL are also collaborating on COMANOID, a four-year research project launched in early 2015 as part of the European Commission’s Horizon H2020 programme, which aims to deploy humanoid robots for non-value-added tasks identified by Airbus Group in civilian airliner assembly operations.

Introducing humanoid technology into aeronautical assembly lines is expected to support human operators in performing the most tedious and dangerous parts of the manufacturing process, freeing up highly skilled workers for higher value-added tasks. Designing robots with a humanoid form will enhance both their dexterity and versatility, making them suitable for tackling a large range of tasks in a variety of environments – all without having to make significant changes to manufacturing processes originally designed for humans.

Realising viable humanoid robotics, however, will require researchers to develop new algorithms in multi-contact planning and control to give robots the sort of human ‘hand-eye coordination’ that will allow them to function effectively in confined and poorly accessible spaces. These algorithms will be tested on a set of use-cases drawn from different Airbus Group divisions and plants, in which the realism and complexity will be increased every year.

Project research will be based on the JRL’s HRP-2 and HRP-4 robots (human-sized humanoid research platforms), and demonstrations of the use-cases will be performed at different Airbus Group production sites around the globe. The project will be supervised by a scientific board composed of Airbus Group, AIST and CNRS members, and a steering board including representatives of all three project partners and members of the Japanese Society for the Promotion of Science (JSPS) and the Japanese Ministry of Economy, Trade and Industry (METI).

Milrem Unveils New Modular Hybrid Unmanned Ground Vehicle

Milrem, a defence solutions provider specialising in military engineering, has unveiled its new modular hybrid unmanned ground vehicle (UGV), THeMIS (Tracked Hybrid Modular Infantry System), at the Singapore Airshow 2016.

A multi-mission vehicle platform that can assist and replace soldiers on the battlefield in complex and hazardous tasks, THeMIS is able to reduce operational risks and work as a force multiplier.

Together with Singapore Technologies Kinetics (ST Kinetics), Milrem has also developed the THeMIS ADDER, a variant of THeMIS equipped with ST Kinetics’ remote weapon station, the RWS ADDER.

THeMIS is a highly modular platform that allows different superstructures to be mounted on and integrated into the central vehicular platform for complex missions such as rescue, transport and reconnaissance. The flexibility and versatility of the system not only increase efficiency but also significantly reduce the life-cycle costs of these complex unmanned systems through simplified maintenance and spare-parts supply.

“Unmanned systems will play a significant role in the development of military capabilities in the future. Within the next 10 years, we will see smart ground systems complementing the human troops during joint missions. We are excited to be in cooperation with ST Kinetics for THeMIS ADDER, and we’re sure that this universal UGV concept will effectively supplement defence capabilities on a battalion level,” said Kuldar Väärsi, CEO of Milrem.

In line with THeMIS, Milrem has also launched the Digital Infantry Battlefield Solution (DIBS), covering the tactical use of smart unmanned systems up to battalion level. The program is being developed in conjunction with the Estonian National Defence College. DIBS analyses a wide variety of tactical use cases and provides new approaches to warfare doctrines. Practical solutions will be tested in cooperation with the Estonian Defence Forces.

Milrem has successfully conducted initial running tests for the THeMIS prototype, and THeMIS will be ready for production by the end of the year.
