Channel: UGV News | Unmanned Ground Vehicles, Military Robots | Robotics News

British Armed Forces Test UGVs


Milrem Titan UGVs

Milrem Robotics has announced that four of its Titan unmanned ground vehicles (UGVs), developed in conjunction with QinetiQ, were put through three weeks of rigorous tests by British troops during the Army Warfighting Experiment 2018 (AWE18) – Autonomous Warrior (Land). With four vehicles, Milrem Robotics was the most represented UGV manufacturer in the exercise.

“The main goal of the experiment, which concluded last week, was to determine how new unmanned technologies can enhance soldier survivability and effectiveness on the modern battlefield,” explained Capt (ret.) Juri Pajuste, Program Director at Milrem Robotics, who took part in the exercises.

The test was conducted in three phases: combat operations without the benefit of new technologies; combat operations using new technologies but without changing tactics; and lastly, combat operations using new technologies and adapting tactics according to the capability that the new technology provides. The UGVs were used in a number of different roles with missions conducted in urban, open and forested terrain.

“The feedback from the users was very positive and they were surprised how agile and durable Milrem’s UGV is,” Pajuste added. Of the four Milrem Robotics developed UGVs, two were deployed by Milrem Robotics and two by QinetiQ. The Milrem fielded systems included one configured as a casualty evacuation and logistical support unit and a second unit equipped with a tethered multi-rotor drone pod provided by Threod Systems.

One of the four UGVs was TITAN Strike, a prototype system carrying a Kongsberg remote weapon station, fully controlled by a remote operator and using QinetiQ’s Pointer system as a means of integrating the capability with dismounted infantry. The second system, TITAN Sentry, also enabled with Pointer, featured a Hensoldt provided sensor suite including electro optical and thermal imaging cameras and a battlefield radar.

Keith Mallon, campaign manager at QinetiQ, said: “AWE 18 is the conclusion of months of hard work, maturing TITAN Sentry and TITAN Strike. We have enjoyed working closely with Milrem Robotics and are looking forward to future collaboration, working together with the world leading THeMIS platform.”

QinetiQ also utilized one of the two TITAN platforms in a logistics configuration as part of its work on the UK’s Autonomous Last Mile Resupply challenge, with demonstrations taking place alongside AWE18.

The post British Armed Forces Test UGVs appeared first on Unmanned Systems Technology.


First Unmanned Public Delivery Service Launched


Kroger and Nuro unmanned delivery vehicle

Robotics firm Nuro and The Kroger Co. have announced the launch of the first-ever unmanned delivery service available to the general public.

The companies have developed a self-driving grocery delivery service in Scottsdale, Arizona, which uses an autonomous Prius fleet accompanied by vehicle operators. The autonomous vehicles have already completed nearly one thousand deliveries to the general public.

The two companies are now expanding the fleet to include Nuro’s custom unmanned vehicle known as the R1. The R1 travels on public roads and has no driver, no passengers and only transports goods. Nuro has been developing the R1 since 2016, and recently announced its partnership with Kroger.

“Nuro envisions a world without errands, where everything is on-demand and can be delivered affordably. Operating a delivery service using our custom unmanned vehicles is an important first step toward that goal,” explained Nuro President and co-founder Dave Ferguson.

“Kroger customers are looking for new, convenient ways to feed their families and purchase the products they need quickly through services like pickup and delivery,” said Yael Cosset, Kroger’s chief digital officer. “Our autonomous delivery pilot with Nuro over the past few months continues to prove the benefit of the flexible and reliable technology. Through this exciting and innovative partnership, we are delivering a great customer experience and advancing Kroger’s commitment to redefine the grocery experience by creating an ecosystem that offers our customers anything, anytime, and anywhere.”


British Army Receives Bomb Disposal UGVs


Harris T7 bomb disposal UGV

The UK Ministry of Defence has announced that the British Army have taken delivery of the first four of 56 new bomb disposal robots following rigorous trials. The Harris Corporation’s T7 unmanned ground vehicles (UGV) are equipped with high-definition cameras, high-speed datalinks, an adjustable manipulation arm, and tough all-terrain treads, allowing them to neutralise a wide range of explosive threats.

The bomb disposal platform endured a variety of tests during an eight-week ‘acceptance’ trials period at UK and US sites specifically chosen to put the robots through their paces. The systems were pushed to their limits by trials including multi-terrain driving, a series of battlefield missions, weightlifting and dexterity tasks, climatic and vibration testing, high stress capabilities, live-firings, maximum traversing angles and interoperability assessments.

Defence Secretary Gavin Williamson said: “These robots will go on to be an essential piece of kit, preventing harm to innocent civilians and the brave operators who make explosives safe. The robots will provide the Army with the latest bomb-disposal technology and will prove to be trusted companions both on UK streets and in deadly conflict zones.”

Col Zac Scott, Head of the Defence EOD & Search Branch said: “Remote Control Vehicles (RCVs) are critical to the safe conduct of Explosive Ordnance Disposal (EOD) tasks. The Harris T7 harnesses cutting-edge technology to provide EOD operators with unprecedented levels of mobility and dexterity. It represents a step-change in capability for our service personnel and it will save lives.”

The bomb disposal robots have been procured by Defence Equipment and Support, the MOD’s procurement organisation, under Project Starter. The deal was originally announced at the 2017 Defence and Security Equipment International (DSEI) Exhibition in London.

Project Starter will procure 56 Harris T7s to support Explosive Ordnance Disposal (EOD) teams. The programme is designed to replace the Army’s fleet of Wheelbarrow Mk8B remote-controlled EOD robots, which have been used across the globe by UK Armed Forces since 1972.

Lt Col Thornton Daryl Hirst, Section Head of Remote Controlled Vehicles within DE&S’ Special Projects Search and Countermeasures team, said: “The first four production standard vehicles have been delivered early to the British Army enabling us to conduct train-the-trainer packages from January onwards. The hard work and dedication of my team has helped ensure that this critical project has run to time and cost and the trials exceeded our performance expectations.”

The Harris T7 robots use advanced haptic feedback to allow operators to ‘feel’ their way through the intricate process of disarming from a safe distance, protecting UK soldiers from threats such as roadside bombs. The haptic feedback function is designed to provide operators with human-like dexterity while they operate the robot’s arm using the remote-control handgrip. The unit gives the operator physical feedback, allowing intuitive, detailed control.
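The force-reflecting idea behind such haptic control can be sketched as a simple spring-model loop. This is a hypothetical illustration only: the spring model, gains, and function names are assumptions, not the T7’s actual control law.

```python
# Hypothetical sketch of force-reflecting teleoperation: the arm tracks the
# operator's command, and contact forces are rendered back on the handgrip.
# Spring stiffness and tracking gain are made-up illustration values.

def contact_force(arm_position: float, obstacle_position: float,
                  stiffness: float = 50.0) -> float:
    """Spring-model force felt when the arm presses past an obstacle."""
    penetration = arm_position - obstacle_position
    return stiffness * penetration if penetration > 0 else 0.0

def haptic_step(operator_cmd: float, arm_position: float,
                obstacle_position: float, gain: float = 0.1):
    """One control cycle: move the arm toward the operator's command and
    return the force to render back on the operator's handgrip."""
    arm_position += gain * (operator_cmd - arm_position)
    feedback = contact_force(arm_position, obstacle_position)
    return arm_position, feedback

# In free space the grip feels nothing; once the arm reaches the obstacle
# at position 1.0, resistance builds on the operator's hand.
pos, force = 0.0, 0.0
for _ in range(100):
    pos, force = haptic_step(operator_cmd=1.2, arm_position=pos,
                             obstacle_position=1.0)
```

The key design point is that the feedback force grows with penetration depth, which is what lets an operator gauge how hard the manipulator is pressing without direct line of sight.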

All 56 robots are due to be delivered to the UK and in service by December 2020.


FLIR Announces Automotive Development Kit for Autonomous Vehicles


FLIR autonomous vision camera

FLIR Systems has announced the launch of its next-generation thermal vision Automotive Development Kit (ADK) for the development of self-driving cars.

FLIR has also unveiled a thermal enhanced self-driving test vehicle that demonstrates how thermal cameras improve the safety of advanced driver-assistance systems (ADAS) and fill performance gaps in the autonomous vehicles (AV) of tomorrow. The thermal autonomous test vehicle will be demonstrated at the 2019 Consumer Electronics Show (CES) at the Las Vegas Convention Center.

The next-generation thermal-vision ADK featuring the high-resolution FLIR Boson thermal camera core is designed to help automakers, tier-one automotive suppliers, and automotive innovators improve the safety of ADAS and self-driving vehicles. Paired with machine-learning algorithms for object classification, the ADK provides critical data from the far infrared portion of the electromagnetic spectrum to improve the decision making of AVs in common environments where other sensors experience challenges, such as darkness, shadows, sun glare, fog, smoke, or haze. The thermal-vision ADK augments the entire sensor suite and offers the redundancy needed to improve safety in AVs.
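The redundancy argument above can be illustrated with a toy confidence-fusion rule that weights the visible camera down as visibility degrades. This is purely hypothetical; FLIR’s actual fusion and classification algorithms are not described at this level of detail.

```python
# Illustrative sketch of sensor redundancy: fuse per-object detection
# confidences from a visible camera and a thermal camera. Each detector is
# assumed to report a score in [0, 1]; the weighting rule is made up.

def fuse_confidence(visible_score: float, thermal_score: float,
                    visibility: float) -> float:
    """Weight the visible camera by scene visibility (1.0 = clear daylight,
    0.0 = darkness/fog/glare); the thermal camera is assumed unaffected
    by lighting conditions."""
    w = max(0.0, min(1.0, visibility))  # clamp to a valid weight
    return w * visible_score + (1.0 - w) * thermal_score

# A pedestrian at night: the visible camera barely responds, the thermal
# camera sees the heat signature clearly, so the fused score stays high.
night_score = fuse_confidence(visible_score=0.2, thermal_score=0.9,
                              visibility=0.1)
```

In daylight the same rule defers to the visible camera, which is the sense in which the thermal channel "augments the entire sensor suite" rather than replacing it.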

The new ADK is IP67 rated and includes an integrated heated window for improved performance in all-weather testing. It also features Gigabit Multimedia Serial Link (GMSL), USB, and Ethernet connection for easier integration.

FLIR is also showing the industry’s first thermal camera-equipped commercial test vehicle featuring multiple FLIR ADK cameras that will provide a 360-degree street view. The car demonstrates the ADK’s integration capabilities with radar, LIDAR, and visible cameras found on autonomous test vehicles today. With thermal camera-enhanced automatic emergency braking (AEB), the car helps validate how thermal imaging with machine learning classification improves the functionality of AEB.

“For automated decision making on the roadway, thermal imaging cameras coupled with machine-learning capabilities provide the most effective method for pedestrian detection to save lives, particularly in cluttered environments or in poor visibility,” said Jim Cannon, President and CEO at FLIR. “Furthermore, the FLIR thermal-enhanced autonomous test vehicle demonstrates how thermal cameras can significantly improve urban, highway, and AEB performance and the overall safety of self-driving cars.”


Velodyne to Showcase New LiDAR Products


Velodyne LiDAR autonomous vehicle

Velodyne Lidar has announced that it will introduce and demonstrate its breakthrough new LiDAR sensor technology at CES 2019 in the Las Vegas Convention Center. Velodyne will present product demonstrations showing how LiDAR is advancing vehicle autonomy, safety, and advanced driver assistance systems (ADAS).

“The new products we are unveiling at CES advance Velodyne’s leadership position in providing the smartest, most powerful lidar solutions for autonomy and driver assistance,” said Anand Gopalan, Ph.D., Chief Technology Officer (CTO) at Velodyne Lidar. “Delivering integrated hardware and software safety solutions is extremely valuable to automakers with the technologies seamlessly working together to provide breakthrough advanced driver assistance systems.”

At CES, live demonstrations of Velodyne’s lidar sensors will show their combination of long range, high resolution, and wide field of view. Velodyne partners also will lead in-booth presentations, unveil new technologies, and demonstrate lidar’s use in autonomy, marine, agriculture, and emerging industries.

Velodyne demonstrations at CES will include:

Velodyne Alpha Puck – The culmination of ten years of LiDAR development and learning from millions of road miles, the Alpha Puck is a sensor specifically made for autonomous driving and advanced vehicle safety at highway speeds. Designed for Level 4-5 autonomy, the sensor produces an image with the highest resolution data set in the world.

Velodyne Velarray – The Velarray’s range, resolution, and field of view facilitate robust object detection, allowing for longer braking distance and increased safety. Designed for seamless vehicle integration, this compact sensor generates a richly-detailed directional image, day or night. It can be concealed within roof lines, in bumpers, and behind windshields.

The Velodyne booth will also feature Velodyne’s Augmented Reality demonstration that allows people to experience how autonomous vehicles see the world.

“At CES, people can come to the Velodyne booth to experience how our intelligent lidar sensors are enabling autonomous vehicles on the road today,” said Mike Jellen, president and chief commercial officer of Velodyne Lidar. “They can see how Velodyne’s rich computer perception data helps determine the safest way to navigate and direct a self-driving vehicle. Visitors to our booth can also learn how Velodyne’s versatile lidar sensors are utilized in a myriad of trailblazing applications in addition to self-driving cars and driver assistance, including unmanned aerial vehicles, mapping, industrial safety, robotics, security, and more.”

Velodyne will present products and presentations from its network of customers and partners that are using lidar technology in a range of innovative solutions. These partners include Accur8vision, AGC, Apex.AI, AutonomouStuff, DeepMap, Local Motors, MechaSpin, and Paracosm.

Accur8vision – Equipped with Velodyne lidar, Accur8vision is a volumetric detection system that protects an entire area needing to be secured, compared to perimetric detection which only guards the boundary.

AGC – A supplier of flat, automotive, and display glass, as well as chemicals and other high-tech materials and components, AGC will showcase windshield technology from its WIDEYE task force. WIDEYE is focused on autonomous vehicles and solid-state lidar integration solutions. Combined with the Velarray sensor for an interactive demo in Velodyne’s booth, WIDEYE’s infrared-transparent automotive-grade glass provides an ADAS and autonomous solution featuring safer perception.

Apex.AI – Apex.AI builds reliable, safe, and certified software for autonomous vehicles and other autonomous mobility systems. Apex.OS is an SDK compatible with ROS 2 (Robot Operating System). It provides a production-grade, safety-certified real-time framework for developing safe and secure autonomous vehicle applications. Apex.Autonomy provides functional building blocks for autonomous vehicles on top of Apex.OS, such as libraries for 3D lidar perception, including integration of Velodyne lidars.

DeepMap – HD mapping is a crucial piece of the autonomous vehicle stack that needs to be robust, reliable, and highly scalable. DeepMap provides state-of-the-art mapping and localization to autonomous vehicles as a service. DeepMap helps its customers expedite their autonomous vehicle technology development in a safe and scalable way. Velodyne’s lidar is widely used by DeepMap and its customers for autonomous driving as well as mapping and localization.

Local Motors by LMI – Local Motors will show the world’s first co-created, electric, self-driving shuttle, Olli. On display will be a current R&D prototype of Olli that is nearly 90 percent 3D-printed and integrates a range of Velodyne sensors. Local Motors partnered with Velodyne to showcase how Velodyne sensors allow Olli not only to see in 360 degrees, but also to cover multiple overlapping areas at greater distance with more reliability.

MechaSpin – A lidar sensor integrator, MechaSpin will showcase how it has utilized Velodyne’s lidar technology to develop an ecosystem of capabilities to provide solutions in the maritime, intermodal, agriculture, and material handling industries. MechaSpin’s proprietary MSx Processing Engine enables rapid adoption and integration of lidar sensor technology for custom applications.

Paracosm – Developer of the PX-80, a handheld 3D mapping device that captures large-scale indoor and outdoor spaces in minutes using Velodyne’s Puck sensor.


Cyber Security Solution Demonstrated on Autonomous Vehicle Platform


Autonomous car cybersecurity

Argus Cyber Security has announced that it is demonstrating its cyber security solution on the NVIDIA DRIVE computing platform for autonomous vehicles at the CES 2019 trade show.

Argus is working closely with NVIDIA to add layers of cyber security defense to NVIDIA DRIVE, an energy-efficient, high-performance AI computing platform designed to safely enable autonomous vehicles of all types. The rapid development of self-driving cars, along with the growing number of connected services, has increased the potential for vehicle cyber-attacks. With self-driving cars quickly becoming a reality, and the number and diversity of connected automotive technologies and services increasing, cyber threats to vehicles are growing. Argus’ Connectivity Protection, coupled with its Argus Lifespan Protection, can offer vital cyber security protections.

“As a leader in the autonomous driving market, NVIDIA is accelerating the pace of self-driving technology and must ensure it remains secure in the face of multiplying and evolving cyber risks,” said Yoni Heilbronn, CMO, Argus Cyber Security. “Our industry-leading automotive cyber security expertise and innovative solutions will protect self-driving vehicles from the cyber threats of today and tomorrow.”

Argus Connectivity Protection prevents malware installation, detects operating system anomalies, isolates suspicious activity and stops attacks from spreading to the in-vehicle network. Argus Lifespan Protection enables automakers and fleet managers to continuously monitor the cyber health of their vehicles in the cloud, provides big data analytics to identify patterns and emerging attacks, and future-proofs vehicles through over-the-air security updates. The integrated solution makes it possible for automakers to seamlessly embed crucial cyber security measures without affecting production cycles or increasing project risk.
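One common form of the in-vehicle anomaly detection alluded to above is frequency monitoring of network message IDs: most CAN traffic is periodic, so a flooded or never-before-seen ID stands out. The sketch below is a generic illustration with made-up IDs and thresholds, not Argus’ product logic.

```python
# Minimal sketch of frequency-based anomaly detection on an in-vehicle
# network. The baseline (expected message counts per time window) would be
# learned from known-good traffic; values here are illustrative.
from collections import Counter

def find_anomalous_ids(observed: Counter, baseline: dict,
                       tolerance: float = 2.0) -> set:
    """Flag message IDs whose observed count exceeds `tolerance` times the
    learned baseline for the same window, or that never appeared in the
    baseline at all (a possible injected message)."""
    anomalies = set()
    for can_id, count in observed.items():
        expected = baseline.get(can_id)
        if expected is None or count > tolerance * expected:
            anomalies.add(can_id)
    return anomalies

baseline = {0x100: 50, 0x200: 10}  # counts per window, learned offline
# One observed window: 0x200 is being flooded, 0x7DF was never seen before.
window = Counter({0x100: 52, 0x200: 45, 0x7DF: 3})
flagged = find_anomalous_ids(window, baseline)
```

A production system layers many such detectors (payload models, state machines, timing analysis); frequency checks are just the simplest to show.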

“Safety is our highest priority. We’re building a best-in-class ecosystem to address current and future safety issues, including cyber threats,” said Rishi Dhall, vice president of automotive business development at NVIDIA. “Argus’ demonstration on the NVIDIA DRIVE platform adds yet another layer of defense to our ongoing efforts to thwart potential cyber security attacks on autonomous vehicle platforms.”


Construction Robot Prototype Built for Army Corps of Engineers


CMU robot prototype

Carnegie Mellon University has announced that its National Robotics Engineering Center (NREC) has constructed the largest robot in the 22-year history of the organization. Its 45-foot-tall gantry, visible from Pittsburgh’s 40th Street Bridge, was built as part of a U.S. Army Corps of Engineers prototyping project to automate its annual mat-sinking operations on the Mississippi River. The massive mats, which consist of concrete blocks wired together, shield riverbanks from erosion, helping to protect levees and ensure safe river navigation.

The prototype robot being built on NREC’s front lawn will serve to test and further develop systems that will become part of the final, much larger robot – a floating factory called ARMOR 1 – that eventually will be deployed on barges on the Mississippi.

The NREC gantry supports a 55-foot-long, 24-ton arm that is about 20 feet above the ground. A carriage suspended from the arm will have two hoists for picking up, transporting and positioning concrete blocks so they can be tied together with wire to create the mats. Each concrete “square” is 25 feet long, four feet wide and three inches thick, and weighs 3,600 pounds.
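A quick back-of-the-envelope check shows the quoted figures are self-consistent, assuming normal-weight concrete at roughly 145 lb per cubic foot (the density is an assumption; the article gives only dimensions and weight).

```python
# Sanity check of the block figures: 25 ft x 4 ft x 3 in at a typical
# concrete density should land near the quoted 3,600 lb.
length_ft, width_ft, thickness_ft = 25.0, 4.0, 3.0 / 12.0
volume_ft3 = length_ft * width_ft * thickness_ft   # 25 cubic feet per block
est_weight_lb = volume_ft3 * 145.0                 # ~3,625 lb vs. 3,600 quoted

# The final ARMOR 1 system is to produce mats 35 squares wide, so a single
# row of squares across the mat already weighs on the order of 60 tons.
row_weight_tons = 35 * 3600 / 2000
```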

A deck has been installed for moving four rows of concrete blocks as they are tied together; in the final, deployed robot, the conveyance system also will launch the completed mats into the river.

An automated mat-tying system, now being built and tested inside the NREC building in Pittsburgh’s Lawrenceville section, will be added to the outdoor assemblage in mid-2019.

The test system is an unconventional robot and bigger than any previous NREC project, including a system for the U.S. Air Force to remove coatings from aircraft using three-story-high, laser-equipped mobile robots, and an autonomous mine truck for Caterpillar.

The Corps of Engineers’ ARMOR 1 final prototype robot will dwarf this current test system. It will have six of the 55-foot arms for moving concrete squares. The assembly barge will measure 180 feet long, 75 feet wide and 45 feet high, said Gabriel Goldman, technical lead for the project at NREC. It will produce mats with 35 rows of concrete squares.

“And that’s just the barge with the arms,” he added, noting the system will include several barges. “When you zoom out, this thing is massive – and it’s all floating.”

Mat sinking takes place during the low-water months of August through December and is very labor intensive. Four gantry cranes are used to move concrete blocks from supply barges to a work barge where workers wire the mat together using pneumatic tools. As the mat is being assembled, the work barge inches away from shore to launch the mat along the sloping river banks. The Mat Sinking Unit, which has been in operation for 70 years, is crucial in preventing erosion to the riverbank of the Mississippi River, a vital commercial waterway that drains 41 percent of the nation’s water.

The goal of the automated system is to increase the amount of mat that can be assembled and launched each day, while improving worker safety and reducing operating costs. The new system will add technical skilled jobs such as robotic control operators to the ARMOR 1 workforce.

NREC is a subcontractor to SIA Solutions, and is working with Bristol Harbor Group on the ARMOR 1 project. NREC is responsible for designing the robotic system to automate the entire mat assembly and launching process.

In the latest phase of the project, NREC researchers will use the newly built robot to test each part of the process. That includes automatically picking up, moving and positioning the concrete blocks, as well as detecting when blocks are broken or otherwise defective. They also will be testing methods for automatically tying the mats together.

NREC is scheduled to finish its work by spring of 2020. The full-scale robotic system, to be deployed in 2021, will be built by another contractor.


GNSS Positioning & Correction Developed for Autonomous Driving


Septentrio and Sapcorda GNSS solution for autonomous driving

Septentrio, a developer of high-precision GNSS technology, and Sapcorda, a provider of safe, broadcast GNSS correction services, have announced that the two firms are performing live demonstrations of a safe high-accuracy positioning and correction solution for automated driving.

The companies have combined their respective technologies to deliver the benefits of SSR (State Space Representation) technology seamlessly to OEM automakers and Tier 1 integrators. These benefits include decimeter accuracy within seconds, anywhere over an entire continent, to support autonomy levels from lane keeping to full autonomy with totally homogeneous coverage. The GNSS augmentation service is scalable through simple broadcast corrections, and safety awareness is provided via Sapcorda’s unique integrity concept and Septentrio’s integrity monitoring engine.

“We are excited to be able to provide live demonstrations of Sapcorda’s safe and precise correction service especially designed for autonomous driving,” said Jan Van Hees, Business Development Director at Septentrio. “Sapcorda provides a unique high-precision GNSS correction service designed for fast, homogeneous accuracy at continental coverage, thus ideal for autonomous and mass market applications.”

“Septentrio specializes in high-precision & high reliability GNSS positioning for a variety of industrial and commercial markets. They have developed a range of technologies, including unique jamming robustness and integrity positioning to support safety-sensitive applications in various challenging environments. Combining this with our safety-centered correction service it is a unique solution for developers of autonomous driving systems,” said Goran Jedrejčić, Business Development Manager at Sapcorda.

“With fast and efficient implementation of Sapcorda SSR-based correction service into Septentrio’s GNSS-platform, we were able to demonstrate the efficiency of the technology for automotive use in a robust & highly efficient way,” continued Jedrejčić. “Septentrio offers a unique blend of GNSS-based technologies and is an ideal partner for both traditional and new markets, with growing demand for high-precision positioning.”

Septentrio also recently announced its new mosaic compact multi-constellation GNSS Receiver SiP (System-in-Package) module. The Septentrio mosaic, a multi-band, multi-constellation receiver in a low-power surface-mount module with a wide array of interfaces, is designed for mass market applications like robotics and autonomous systems such as drones. The mosaic module integrates the latest GNSS and RF ASIC technology, as well as the robust positioning engine from Septentrio.

All Septentrio GNSS receivers and modules feature advanced AIM+ on-board interference mitigation technology. Septentrio GNSS receivers can suppress the widest variety of interferers, from simple continuous narrowband signals to the most complex wideband and pulsed jammers.



AI-Based Control Unit Enables Autonomous Driving Applications


Xilinx and ZF automotive AI platform

Xilinx and automotive technology developer ZF Friedrichshafen AG (ZF) have announced a new strategic collaboration in which Xilinx technology will power ZF’s highly-advanced artificial intelligence (AI)-based automotive control unit, called the ZF ProAI, to enable automated driving applications.

ZF is using the Xilinx Zynq UltraScale+ MPSoC platform to handle real-time data aggregation, pre-processing, and distribution, as well as to provide compute acceleration for the AI processing in ZF’s new AI-based electronic control unit. ZF selected this adaptable, intelligent platform because it provides the processing power, scalability and flexibility essential for the ZF ProAI platform to be customized for each of its customers’ unique requirements.

“The unique selling proposition of the ZF ProAI is its modular hardware concept and open software architecture. Our aim is to provide the widest possible range of functions in the field of autonomous driving,” explained Torsten Gollewski, head of ZF Advanced Engineering and general manager of Zukunft Ventures GmbH. This approach is unique compared to other systems on the market, which use a fixed combination of hardware and software architecture – a solution that can potentially limit functionality and add more cost.

“We are proud to partner with ZF on its ProAI platform and help solve the challenges associated with autonomous vehicle development,” said Yousef Khalilollahi, vice president, core vertical markets, Xilinx. “By providing an adaptable hardware platform, ZF can design flexible and scalable systems that seamlessly incorporate AI compute acceleration and functional safety (FuSa) through diversity in processing engines. We look forward to expanding our collaboration with ZF to take autonomous and AI innovation to the next level.”

Xilinx has been selling chips to automakers and Tier 1 automotive suppliers for over 12 years. More than 160 million Xilinx devices are in automotive systems today, and approximately 55 million of these are used for ADAS alone.


US Army TALON UGV Fleet to be Upgraded


US Army TALON UGV

QinetiQ North America (QNA) has announced that it has been awarded a $90 million contract to support the overall sustainment actions of the Tactical Adaptable Light Ordnance Neutralization (TALON) family of robotic systems for the US Army. QNA will be providing ongoing maintenance, upgrades and servicing of the Army’s existing, fielded fleet of TALON unmanned ground vehicles (UGVs).

Over 4,000 TALON robots are now deployed around the world by the U.S. and its allies. They are used primarily to assist military personnel with the extremely dangerous job of detecting and disabling roadside bombs or Improvised Explosive Devices (IEDs) planted by hostile forces to attack troops. TALON robots have been used in more than 80,000 counter-IED missions to date.

“We are proud that TALON continues to be recognized as a robot that is vital to the U.S. military for EOD and counter-IED missions because of its combat proven ruggedness, ease of use, advanced flexible architecture and reliability,” said Jeff Yorsz, president and CEO of QinetiQ North America. “Helping to keep U.S. Soldiers and Marines safe is what drives us and we continue to maintain and improve the TALON product line in support of that goal.”

The contract was awarded by the U.S. Army Contracting Command Warren located at Detroit Arsenal (ACC-Warren) on behalf of PM Force Projection’s Robotic Logistics Support Center (RLSC) which provides fielded robotics hardware across the operational spectrum.


Fleet of Autonomous Delivery Robots Commences Operations


Sodexo delivery robots

Sodexo, Inc. and Starship Technologies have announced the launch of robot food delivery services at George Mason University’s Fairfax, VA campus. Mason’s 40,000 students, faculty and staff can access the Starship Deliveries app (iOS and Android) to order food and drinks to be delivered anywhere on campus, within minutes.

With a fleet of more than 25 robots at launch, this initiative is the largest implementation of autonomous robot food delivery services on a university campus and is representative of Sodexo’s next-generation technology portfolio for the College and University Market. Sodexo, a global provider of food and facilities management services, is committed to developing and offering innovative services that enhance the campus experience and meet the expectations of today’s students. The service works in conjunction with student meal plans. By making food and drink more accessible, Sodexo and Starship are aiming to make the hectic, on-the-go lives of Mason students and faculty a little easier.

“We’re excited that our students, faculty and staff get to be at the forefront of this pioneering campus food delivery service,” said Mark Kraner, Executive Director for Campus Retail Operations at George Mason University. “This will enhance life for everyone at the University, and that’s something we’re continuously looking to build upon. Our commitment to providing an optimal campus experience is one of the things that distinguishes George Mason University as a place where everyone can thrive.”

To get started, users open the Starship Deliveries app, choose from a range of their favorite food or beverage items, then drop a pin where they want their delivery to be sent. They can then watch as the robot autonomously makes its journey to them, via an interactive map. Once the robot arrives, they receive an alert, and can then meet the robot and unlock it through the app. The entire delivery usually takes 15 minutes or less, depending on the menu items ordered and the distance the robot must travel. Each robot can carry up to 20 lbs – the equivalent of about three shopping bags of goods.
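The order-to-unlock flow described above amounts to a small state machine. The sketch below is a hypothetical illustration: the class, method names, and authentication check follow the article’s description of the user experience, not Starship’s actual API.

```python
# Hypothetical sketch of the delivery workflow as a state machine:
# ordered -> en_route -> arrived -> delivered, with the cargo bay
# unlocking only for the ordering customer's authenticated app session.

MAX_PAYLOAD_LB = 20  # per the article: about three shopping bags of goods

class Delivery:
    def __init__(self, items_weight_lb: float, destination: tuple):
        if items_weight_lb > MAX_PAYLOAD_LB:
            raise ValueError("order exceeds robot payload")
        self.destination = destination   # the pin the user dropped
        self.state = "ordered"

    def dispatch(self):
        assert self.state == "ordered"
        self.state = "en_route"          # app shows robot on interactive map

    def arrive(self):
        assert self.state == "en_route"
        self.state = "arrived"           # user receives an alert

    def unlock(self, app_authenticated: bool) -> bool:
        """Open the cargo bay only at the destination, only for the
        ordering customer's app."""
        if self.state == "arrived" and app_authenticated:
            self.state = "delivered"
            return True
        return False

# Lunch order to a (made-up) spot on the Fairfax campus
d = Delivery(items_weight_lb=8.5, destination=(38.8315, -77.3119))
d.dispatch()
d.arrive()
```

Modeling the flow this way makes the safety property explicit: there is no path to an open cargo bay that skips both arrival and authentication.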

“College students understand the benefits of technology on campus and expect it to be integrated into their daily lives,” said Ryan Tuohy, SVP, Business Development, Starship Technologies. “With the hectic schedules students lead, there is a convenience for students to have their food, groceries and packages delivered. Our goal is to make life a little bit easier for students, whether that means skipping the line, eating lunch on the lawn rather than in the cafe, or finding the time to eat better when studying for exams. Commuter students can even meet the robot on their way into class. We look forward to seeing how our service will help and support the daily lives and community of students and educators at George Mason University.”

Starship Technologies is an autonomous delivery service that operates commercially on a daily basis around the world. Their robots have completed over 25,000 deliveries and travelled more than 150,000 miles. The robots use sophisticated machine learning, artificial intelligence, and arrays of sensors to seamlessly travel on sidewalks and navigate around obstacles. The computer vision-based navigation uses proprietary technology to provide precision in telemetry to the nearest inch. The robots can cross streets, climb curbs, travel at night and operate in both rain and snow. In addition, the robots can be stored in pods located around campus where their batteries are automatically switched so they can continue to operate independently, with no human involvement.

“University dining programs are evolving their strategies to meet this generation’s elevated expectations, such as better quality, variety and service delivery,” said Jim Jenkins, CEO, Universities East, Sodexo North America. “George Mason University’s culture of innovation and early adoption makes it the perfect campus for Sodexo and Starship to introduce this cutting-edge technology and enhance the campus experience for the entire school community.”

The post Fleet of Autonomous Delivery Robots Commences Operations appeared first on Unmanned Systems Technology.

FLIR to Acquire Endeavor Robotics


Endeavor Robotics unmanned ground vehicle

FLIR Systems has announced that it has entered into a definitive agreement to acquire Endeavor Robotic Holdings, Inc., a developer of battle-tested, tactical unmanned ground vehicles (UGVs) for the global military, public safety, and critical infrastructure markets. Endeavor’s highly mobile, easy-to-operate ground robots use advanced sensing and actuation to provide explosive ordnance disposal, reconnaissance, inspection, and hazardous materials support for troops, police, and industrial users at stand-off range. Along with the recent acquisition of Aeryon Labs, FLIR has significantly expanded its unmanned systems capabilities.

Based outside Boston and formerly known as iRobot Defense & Security, Endeavor has shipped more than 7,000 UGVs to customers in over 55 countries. Their robots have been deployed in numerous applications, including by police and SWAT teams, by first responders, and at nuclear power and industrial plants. Endeavor is one of the largest UGV providers to the United States (U.S.) Department of Defense (DoD), and a key supplier of unmanned systems for the accelerating modernization of global military and law enforcement operational assets. Having recently been awarded the U.S. Army’s Man Transportable Robotic System Increment II (MTRS Inc II) contract, Endeavor is a major participant in several programmatic opportunities with the U.S. DoD and allied militaries worldwide.

“The acquisition of Endeavor Robotics, coupled with previous acquisitions of Aeryon Labs and Prox Dynamics, has positioned FLIR as a leading unmanned solutions provider and advances the strategy we detailed at our Investor Day last year,” said Jim Cannon, President and CEO of FLIR Systems. “This acquisition aligns with our evolution from sensors to intelligent sensing and ultimately solutions that save lives and livelihoods. Endeavor’s momentum with the U.S. DoD and other global defense and police forces provides us significant opportunity to participate in long-term franchise programs and will help us create growth for the company.”

With the largest deployed fleet of tactical robotic systems in the world, Endeavor has over 30 years of proven experience in advanced ruggedized UGVs. The company’s family of UGVs covers a broad spectrum of robot weight classes, from the five-pound throwable FirstLook reconnaissance robot to the 500-pound Kobra heavy-lift robot, and all models are controllable under one common command and control system. Endeavor’s modular approach to design also allows for varying payloads and sensor systems for the needs of each customer, including imaging and reconnaissance, vehicle and room inspection, bomb disposal, hazardous materials detection and disposal, radiation monitoring, and route clearance.

“Joining forces with FLIR Systems will allow Endeavor to take its life-saving technology to the next level, so we can better serve the warfighter, the police officer, and the first responder who use our robots on the frontlines every day,” said Sean Bielat, CEO of Endeavor. “We’re excited to be part of a company that sees its strategic growth in unmanned systems and will invest in our products, platforms and people. It is a tremendous fit and we look forward to the mission ahead.”

Upon closing of the acquisition, Endeavor will be part of the FLIR Government and Defense Business Unit’s Unmanned Systems and Integrated Solutions division.

The post FLIR to Acquire Endeavor Robotics appeared first on Unmanned Systems Technology.

World’s First Anti-Tank UGV Announced


Milrem Anti-Tank UGV

MBDA and Milrem Robotics have announced that the two firms are showcasing the world’s first anti-tank unmanned ground vehicle (UGV) at IDEX 2019, the main defence industry event in the MENA region. The debut of the system’s advanced concept comes eight months after Milrem Robotics and MBDA announced the start of feasibility studies of the system.

The joint project integrates the Milrem Robotics THeMIS unmanned ground vehicle with the MBDA IMPACT (Integrated MMP Precision Attack Combat Turret) system fitted with two MMP 5th generation battlefield engagement missiles and a self-defence machine gun. “This combination of two of the most modern technologies in their field is a very good example of how robotic warfare systems will bring disruption to the battlefield and make some traditional technologies obsolete,” said Kuldar Väärsi, CEO of Milrem Robotics. “Our unmanned land combat system under study together with MBDA will be very efficient in keeping our troops safe and significantly increasing the capability to fight main battle tanks as well as any other ground target,” Väärsi added.

The land combat warfare system is intended to be remotely operated. Soldiers can deploy it while remaining at a safe distance and using a wireless or a tethered connection. The system will have a low heat and noise signature so it can stay unnoticed until completing its mission.

“Being delivered to the French Army since the end of 2017, the MMP system is now deployed by the French forces in theatre, where it replaces the Milan and Javelin missiles. With fully digitalized functions of observation, targeting, positioning and guidance, the MMP system is perfectly suited for integration on vehicles, including remotely operated ones,” said Francis Bordachar, Military Advisor Land Products at MBDA.

The post World’s First Anti-Tank UGV Announced appeared first on Unmanned Systems Technology.

ECA Group Unveils Tactical Man-Portable UGV


ECA Group CAMELEON LG UGV

ECA Group has announced the launch of a new lightweight, compact and rugged unmanned ground vehicle (UGV) based on its CAMELEON UGV platform. The CAMELEON LG has been designed to meet the operational needs of military and security personnel.

The CAMELEON LG is designed to be carried in a backpack in addition to the standard equipment of a deployed infantry soldier without overloading the operator. The system is designed to improve the capabilities of a unit deployed on the field without slowing down or hindering it in its tactical movements.

Lightweight and versatile (12 kg without payload, 15 kg with), the CAMELEON LG can be equipped with a manipulator arm (lifting capacity of 4.5 kg) or sensors (laser range finder, radiological sensor, chemical sensor, thermal camera, etc.). It features an operating range of up to 500 meters and the ability to climb 45-degree slopes.
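The published figures above lend themselves to simple mission-feasibility checks. The sketch below encodes them as constants; the helper functions themselves are illustrative assumptions, not ECA Group software.

```python
# Published CAMELEON LG figures (from the article); helper names are hypothetical.
PLATFORM_MASS_KG = 12.0      # without payload
MAX_TOTAL_MASS_KG = 15.0     # with payload
ARM_LIFT_CAPACITY_KG = 4.5
RADIO_RANGE_M = 500
MAX_SLOPE_DEG = 45


def loadout_ok(payload_masses_kg):
    """Check a candidate sensor/arm loadout against the 15 kg mass budget."""
    return PLATFORM_MASS_KG + sum(payload_masses_kg) <= MAX_TOTAL_MASS_KG


def can_reach(distance_m, slope_deg):
    """Rough feasibility check for a waypoint: range and grade limits only."""
    return distance_m <= RADIO_RANGE_M and slope_deg <= MAX_SLOPE_DEG


print(loadout_ok([1.2, 0.8]))   # e.g. thermal camera + range finder: True
print(can_reach(450, 30))       # within range and grade limits: True
print(can_reach(600, 30))       # beyond the 500 m operating range: False
```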

Resistant to dust, water and shock (up to IP65), it can be thrown during its deployment to limit the vulnerability of the operator or to reach inaccessible places such as windows, balconies, low walls and cliffs. The CAMELEON LG’s rugged construction also allows it to be loaded into all types of military vehicles without the need for a special storage container.

Operational in only three minutes, the CAMELEON LG is simple and fast to deploy, meeting the responsiveness needs of a tactical mission. Its high-resolution cameras, light weight, small size, agility and off-road capabilities make it an ideal robot for inspecting terrain, culverts, homes, caches or the undersides of vehicles in search of IEDs, suspicious packages or hidden triggering devices.

With an intuitive design and ergonomic touch screen and buttons, the system controls allow the operator to direct the robot, interpret received images and data, and manipulate the arm and sensors. An auto-diagnostic maintenance system helps reduce the operator’s workload. The control station is also resistant to shocks, water and dust.

The CAMELEON LG has an endurance capability of 4 hours on the move and 15 hours in stand-by mode.

The post ECA Group Unveils Tactical Man-Portable UGV appeared first on Unmanned Systems Technology.

FLIR Systems Completes Acquisition of Endeavor Robotics


Endeavor Robotics UGV

FLIR Systems has announced that it has completed its previously announced acquisition of Endeavor Robotic Holdings, Inc., a developer of battle-tested, tactical unmanned ground vehicles (UGVs) for the global military, public safety, and critical infrastructure markets, from Arlington Capital Partners for $382 million in cash.

Based outside Boston and formerly known as iRobot Defense & Security, Endeavor has shipped more than 7,000 UGVs to customers in over 55 countries. Endeavor’s highly mobile, easy-to-operate ground robots use advanced sensing and actuation to provide explosive ordnance disposal, reconnaissance, inspection, and hazardous materials support for troops, police, and industrial users at stand-off range. Endeavor is one of the largest UGV providers to the United States (U.S.) Department of Defense (DoD), and a key supplier of unmanned systems for the accelerating modernization of global military and law enforcement operational assets.

“With the addition of Endeavor, FLIR becomes a leading provider of unmanned aerial and ground solutions to support the needs of warfighters, and public safety and critical infrastructure professionals,” said Jim Cannon, President and CEO of FLIR Systems. “We are pleased to welcome Endeavor to the FLIR team and look forward to driving our mission to innovate technologies that help save lives and livelihoods.”

Endeavor is now part of the FLIR Government and Defense Business Unit’s Unmanned Systems and Integrated Solutions division.

The post FLIR Systems Completes Acquisition of Endeavor Robotics appeared first on Unmanned Systems Technology.


QinetiQ North America to Provide Small UGVs to U.S. Army


QNA Common Robotic System-Individual UGV

QinetiQ North America (QNA) has announced that it has won the competition for the U.S. Army’s Common Robotic System-Individual (CRS(I)) program. The seven-year Indefinite Delivery Indefinite Quantity (IDIQ) contract is for the delivery of small unmanned ground vehicles (UGVs). It includes a Low Rate Initial Production (LRIP) phase worth approximately $20m over one to two years, followed by a series of annual production releases. QNA has been awarded an initial order as part of the LRIP phase.

The CRS(I) robot is designed to be back-packable and is equipped with advanced sensors and mission modules for dismounted forces to enhance mission capabilities. CRS(I) features an interoperability profile (IOP) compatible open architecture to support a variety of payloads and missions to detect, identify, and counter hazards.

This significant win for small ground robots builds on QNA’s recent contract awards for the Route Clearance Interrogation System program (RCIS) and Phase II of the Common Robotic System-Heavy program (CRS-H).

“Providing robust, reliable, and exceptionally capable ground robots to support our armed services has been a driving passion at QNA for decades now,” said Jeff Yorsz, President of QinetiQ North America. “Our CRS(I) robot combines performance, intuitive control, and easy transport with a very competitive price point. This will redefine the market for next-generation back-packable robots.”

The post QinetiQ North America to Provide Small UGVs to U.S. Army appeared first on Unmanned Systems Technology.

NVIDIA Unveils New Autonomous Driving Simulation Platform


NVIDIA Drive Constellation

NVIDIA has announced the launch of its new NVIDIA DRIVE Constellation autonomous vehicle simulation platform. The cloud-based platform enables millions of miles to be driven in virtual worlds across a broad range of scenarios — from routine driving to rare and dangerous situations — with greater efficiency, cost-effectiveness and safety than is possible in the real world.

DRIVE Constellation is a data center solution composed of two side-by-side servers. One server — DRIVE Constellation Simulator — uses NVIDIA GPUs running DRIVE Sim software to generate the sensor output from the virtual car driving in a virtual world. The other server — DRIVE Constellation Vehicle — contains the DRIVE AGX Pegasus AI car computer, which processes the simulated sensor data.

The driving decisions from DRIVE Constellation Vehicle are fed back into DRIVE Constellation Simulator, enabling bit-accurate, timing-accurate hardware-in-the-loop testing.
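The two-server loop described above can be sketched as a toy hardware-in-the-loop simulation: one function stands in for the Simulator server rendering sensor output, another for the Vehicle server returning driving commands, and each decision is fed back into the virtual world. The braking policy and every number below are invented for illustration; this is not NVIDIA's software.

```python
def sensor_frame(world_state):
    """Stand-in for the Simulator server: render sensor output from the virtual world."""
    # Here the "sensor" just reports the gap to a stopped lead vehicle.
    return {"lead_distance_m": world_state["lead_pos_m"] - world_state["ego_pos_m"]}


def driving_stack(frame):
    """Stand-in for the Vehicle server (the car computer under test)."""
    # Trivial policy: cruise, but brake hard inside a 50 m gap.
    if frame["lead_distance_m"] < 50.0:
        return {"accel_mps2": -6.0}
    return {"accel_mps2": 0.0}


def run_hil_loop(steps=200, dt=0.05):
    """Closed loop: sensor frame -> driving command -> world update -> next frame."""
    world = {"ego_pos_m": 0.0, "ego_speed_mps": 20.0, "lead_pos_m": 120.0}
    for _ in range(steps):
        cmd = driving_stack(sensor_frame(world))           # Vehicle server decides
        world["ego_speed_mps"] = max(0.0, world["ego_speed_mps"] + cmd["accel_mps2"] * dt)
        world["ego_pos_m"] += world["ego_speed_mps"] * dt  # Simulator advances the world
    return world


final = run_hil_loop()
print(final["lead_pos_m"] - final["ego_pos_m"] > 0)  # ego stopped short of the lead car
```

Because every sensor frame and every command passes through the same interfaces the real stack would use, the same loop structure supports the bit-accurate, timing-accurate testing the article describes.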

Simulation will become a key component for third-party and regulatory autonomous vehicle standards. Safety agencies such as TÜV SÜD are already using the platform to formulate their self-driving validation standards.

“TÜV SÜD is looking for simulation tools that are trustworthy, robust and scalable for the approval of autonomous vehicles,” said Houssem Abdellatif, global head of Autonomous Driving and ADAS at TÜV SÜD. “NVIDIA DRIVE Constellation provides a powerful and highly scalable solution to achieve this goal.”

NVIDIA has also announced that Toyota Research Institute-Advanced Development (TRI-AD) is the first customer of DRIVE Constellation. “We believe large-scale simulation tools for software validation and testing are critical for automated driving systems,” said Dr. James Kuffner, CEO of TRI-AD.

At the GPU Technology Conference (GTC), NVIDIA founder and CEO Jensen Huang demonstrated the scalability of the DRIVE Constellation platform seamlessly performing driving tests in the cloud. Developers anywhere in the world can submit simulation scenarios to DRIVE Constellation data centers and evaluate the results from their desks.

This large-scale validation capability is comparable to operating an entire fleet of test vehicles, yet it can accomplish years of testing in a fraction of the time.

DRIVE Constellation is an open platform into which ecosystem partners can integrate their environment models, vehicle models, sensor models and traffic scenarios. By incorporating datasets from the broader simulation ecosystem, the platform can generate comprehensive, diverse and complex testing environments.
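One way to picture such an open platform is a registry into which ecosystem partners plug their models by kind. The sketch below is a minimal hypothetical version of that idea, not NVIDIA's actual integration API.

```python
class SimulationPlatform:
    """Toy open-platform registry; every name here is illustrative."""

    def __init__(self):
        # The four model kinds the article lists for partner integration.
        self.models = {"environment": {}, "vehicle": {}, "sensor": {}, "traffic": {}}

    def register(self, kind, name, model):
        """Partners contribute a model under one of the supported kinds."""
        self.models[kind][name] = model

    def build_scenario(self, environment, vehicle, sensors, traffic):
        """Compose registered models into one test scenario."""
        return {
            "environment": self.models["environment"][environment],
            "vehicle": self.models["vehicle"][vehicle],
            "sensors": [self.models["sensor"][s] for s in sensors],
            "traffic": self.models["traffic"][traffic],
        }


platform = SimulationPlatform()
platform.register("environment", "city", {"map": "urban_loop"})
platform.register("vehicle", "sedan", {"wheelbase_m": 2.8})
platform.register("sensor", "front_lidar", {"channels": 32})
platform.register("traffic", "rush_hour", {"density": "high"})
scenario = platform.build_scenario("city", "sedan", ["front_lidar"], "rush_hour")
print(sorted(scenario))  # ['environment', 'sensors', 'traffic', 'vehicle']
```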

To this end, Cognata, a simulation company, announced that its scenario and traffic model can be supported on DRIVE Constellation. With Cognata’s traffic models, developers can define a number of vehicles and other road users, as well as their behavior, based on real-world traffic behavior.

“Cognata and NVIDIA are creating a robust solution that will efficiently and safely accelerate autonomous vehicles’ market entry,” said Danny Atsmon, CEO of Cognata. “Highly accurate and scalable traffic model simulation technology is essential to validate autonomous vehicle systems with very large combinations of real-world scenarios.”

IPG Automotive, an automotive simulation company, is another ecosystem partner working with NVIDIA to enable high-fidelity vehicle models. Its simulation software, CarMaker, is used to create virtual vehicle prototypes, including models of all main vehicle subsystems. Developers can test vehicle responses to changes in steering, road surface, suspension, powertrain and vehicle control systems for function development.

“Together with the support of our ecosystem partners, we’re making available large-scale, cloud-based, open simulation that thoroughly and safely validates self-driving cars under endless challenging situations,” said Zvi Greenstein, general manager at NVIDIA.

The post NVIDIA Unveils New Autonomous Driving Simulation Platform appeared first on Unmanned Systems Technology.

Velodyne Releases LiDAR Solutions for NVIDIA Autonomous Driving Platform


Velodyne LiDAR autonomous vehicle sensing

Velodyne Lidar has announced that its surround-view lidar solutions for collecting rich perception data in testing and validation are available on the NVIDIA DRIVE autonomous driving platform. This enables full, 360-degree perception in real time, facilitating highly accurate localization and path-planning capabilities.

Velodyne sensors’ characteristics are also available on NVIDIA DRIVE Constellation, an open, scalable simulation platform that enables large-scale, bit-accurate hardware-in-the-loop testing of autonomous vehicles. The solution’s DRIVE Sim software simulates lidar and other sensors, recreating a self-driving car’s inputs with high fidelity in the virtual world.

“Velodyne and NVIDIA are at the forefront of delivering the high-resolution sensing and high-performance computing needed for autonomous driving,” said Mike Jellen, president and chief commercial officer of Velodyne Lidar. “As an NVIDIA DRIVE ecosystem partner, our intelligent lidar sensors are foundational to advance vehicle autonomy, safety, and driver assistance systems at leading global manufacturers.”


Velodyne provides a broad portfolio of lidar solutions, which spans the full product range required for advanced driver assistance and autonomy by automotive OEMs, truck OEMs, delivery manufacturers, and Tier 1 suppliers. Proven through learning from millions of road miles, Velodyne sensors help determine the safest way to navigate and direct a self-driving vehicle. The addition of Velodyne sensors enhances Level 2+ advanced driver assistance systems (ADAS) features including Automatic Emergency Braking (AEB), Adaptive Cruise Control (ACC), and Lane Keep Assist (LKA).
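As a hedged illustration of how lidar range data can drive one of the ADAS features named above, the snippet below implements a classic time-to-collision (TTC) trigger for automatic emergency braking. The thresholds are arbitrary examples for the sketch, not values from Velodyne or any OEM.

```python
def time_to_collision(range_m, closing_speed_mps):
    """TTC in seconds; infinite when the gap is opening or steady."""
    if closing_speed_mps <= 0.0:
        return float("inf")
    return range_m / closing_speed_mps


def aeb_command(range_m, closing_speed_mps, brake_ttc_s=1.5, warn_ttc_s=3.0):
    """Map a lidar range / range-rate pair to an illustrative AEB action."""
    ttc = time_to_collision(range_m, closing_speed_mps)
    if ttc < brake_ttc_s:
        return "BRAKE"
    if ttc < warn_ttc_s:
        return "WARN"
    return "NONE"


print(aeb_command(40.0, 10.0))   # 4.0 s to collision -> NONE
print(aeb_command(25.0, 10.0))   # 2.5 s -> WARN
print(aeb_command(12.0, 10.0))   # 1.2 s -> BRAKE
```

Production AEB systems fuse several sensors and track full trajectories; the point here is only that dense, accurate lidar ranging makes the range and range-rate inputs to such a trigger trustworthy.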

“Velodyne’s lidar sensors help deliver the intelligence to enable automated driving systems and roadway safety by detecting more objects and presenting vehicles with more in-depth views of their surrounding environments,” said Glenn Schuster, senior director of sensor ecosystem development at NVIDIA.

The post Velodyne Releases LiDAR Solutions for NVIDIA Autonomous Driving Platform appeared first on Unmanned Systems Technology.

LiDAR-Equipped UGV Inspects Abandoned Mine


Milrem Robotics Multiscope UGV

Milrem Robotics has announced that its Multiscope commercial unmanned ground vehicle (UGV) has been utilized by Estonian energy company Enefit to determine the condition of the pillars in a mine in which mining ended ten years ago.

The Multiscope, equipped with 3D LiDAR sensors, was used in closed underground locations. As the anchor structure used to support the ceiling is removed after the end of mining operations, people are no longer allowed to enter those areas.

“Enefit uses smart and innovative solutions to make our production more efficient, to minimize our environmental impact, and prevent occupational safety-related risks. An unmanned vehicle is an excellent solution for conducting surveys in an area which may be unsafe or prohibited for people to enter,” said Veljo Aleksandrov, Development Project Director at Enefit. “The vehicle’s mission was successful – we obtained the three-dimensional images and other necessary data which are vital for building a solar power plant on top of the mine.”

Enefit is planning to build one of the largest solar power plants in the region on top of the abandoned mine, generating approximately 3,500 MWh of renewable electricity per year.

“The engineers of Milrem Robotics are working on different robotics solutions designed for the civil market. Above all, they are focusing on autonomous movement in complicated environments. Multiscope is a good example of how our technology helps to increase efficiency and complete complex tasks,” said Kuldar Väärsi, CEO of Milrem Robotics.

For this mission the Multiscope was piloted underground using data from LiDAR sensors and wide-spectrum HDR cameras; however, autonomous operation could be applied in future projects.

The post LiDAR-Equipped UGV Inspects Abandoned Mine appeared first on Unmanned Systems Technology.

Clearpath Robotics Selects Velodyne Lidar Technology for Autonomous Navigation


Clearpath Robotics vehicle with Velodyne sensor

Velodyne Lidar has announced a partnership with Clearpath Robotics to provide its lidar sensors for the Clearpath research robot platform. Clearpath Robotics offers Velodyne sensors to academic and corporate research organizations on its mobile robots for survey and inspection, oil and gas, agriculture, materials handling and other applications.

Clearpath’s robotic solutions utilize Velodyne’s state-of-the-art lidar technology, which features improved resolution, range, and field of view. Velodyne sensors create a 360° real-time map of the environment, allowing the robot to detect and avoid obstacles for safe autonomous navigation.
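A minimal sketch of using such a 360° scan for obstacle avoidance: given lidar returns expressed in the robot frame, find the closest return inside a forward cone and stop if it is too near. The function names, cone width and stop range are assumptions for illustration, not Clearpath or Velodyne software.

```python
import math


def nearest_obstacle(scan_points, heading_deg, cone_deg=30.0):
    """Closest return within a forward cone of a 360-degree scan.

    scan_points: (x, y) returns in the robot frame, metres.
    Returns the minimum range inside the cone, or None if it is clear.
    """
    half = math.radians(cone_deg) / 2.0
    heading = math.radians(heading_deg)
    best = None
    for x, y in scan_points:
        bearing = math.atan2(y, x)
        # Smallest signed difference between the return bearing and the heading.
        diff = math.atan2(math.sin(bearing - heading), math.cos(bearing - heading))
        if abs(diff) <= half:
            r = math.hypot(x, y)
            best = r if best is None or r < best else best
    return best


def safe_to_advance(scan_points, heading_deg, stop_range_m=1.0):
    """True when the forward cone is clear beyond the stop range."""
    r = nearest_obstacle(scan_points, heading_deg)
    return r is None or r > stop_range_m


scan = [(2.0, 0.1), (0.0, 3.0), (-1.5, 0.0)]  # ahead, to the left, behind
print(safe_to_advance(scan, heading_deg=0.0))                    # 2 m ahead: True
print(safe_to_advance(scan, heading_deg=0.0, stop_range_m=2.5))  # too close: False
```

A real planner works on a full point cloud and a costmap rather than a single cone, but the same bearing/range test is the primitive underneath.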

“Velodyne delivers outstanding contributions to the robotics research community by providing reliable, high-quality 3D lidar,” said Julian Ware, General Manager of Clearpath Robotics. “We have been recommending and integrating Velodyne products on our robotic platforms for almost a decade. Velodyne sensors have proven to handle challenging automation tasks and flawlessly function in unfamiliar and unpredictable settings.”

Velodyne’s lidar sensors are designed for seamless integration with robotic platforms: they are easy to mount, have low power consumption, and include a web configuration tool. They are designed to perform in high-stress environmental conditions such as inclement weather, which is essential for outdoor deployments. Velodyne provides comprehensive documentation and support for integration in all applications.

Clearpath provides value-added services for Velodyne with wide-ranging expertise integrating its sensors in customized robotics systems. The company has extensive experience supporting Robot Operating System (ROS)-ready mobile robotics platforms by developing and maintaining ROS drivers, and providing step-by-step ROS tutorials.

“Clearpath provides a research robot platform, equipped with Velodyne sensors, that is easy to use, easy to buy, and easy to get started with development,” said Mike Jellen, President and CCO, Velodyne Lidar. “Clearpath brings the skill set and experience necessary to help customers derive maximum value from high-resolution, 3D data provided by Velodyne lidar to create innovative ground-based mobile robot solutions.”

The post Clearpath Robotics Selects Velodyne Lidar Technology for Autonomous Navigation appeared first on Unmanned Systems Technology.
