Showing posts with label ADAS. Show all posts

Keeping it fresh for 35 years

By Megan Alink, Director of Marketing Communications for Automotive

Recently, my colleagues Paul Leroux and Matt Young showed off a shiny new infographic that introduces readers to the many ways they encounter QNX-based systems in daily life (here and here). After three-and-a-half decades in business, we’ve certainly been around the block a time or two, and you might think things are getting a bit stale. As the infographic shows, that couldn’t be further from the truth here at QNX. From up in the stars to down on the roads; in planes, trains, and automobiles (and boats too); whether you’re mailing a letter or crafting a BBM on your BlackBerry smartphone, the number and breadth of applications in which our customers deploy QNX technology is simply astounding.

For those who like some sound with their pictures, we also made a video to drive home the point that, wherever you are and whatever you do, chances are you’ll encounter a little QNX. Check it out:


“What do you mean, I have to learn how not to drive?”

The age of autonomous driving lessons is upon us.

Paul Leroux
What would it be like to ride in an autonomous car? If you were to ask the average Joe, he would likely describe a scenario in which he sips coffee, plays video games, and spends quality time with TSN while the car whisks him to work. The average Jane would, no doubt, provide an equivalent answer. The problem with this scenario is that autonomous doesn’t mean driverless. Until autonomous vehicles become better than humans at handling every potential traffic situation, drivers will have to remain alert much or all of the time, even if their cars do 99.9% of the driving for them.

Otherwise, what happens when a car, faced with a situation it can’t handle, suddenly cedes control to the driver? Or what happens when the car fails to recognize a pedestrian on the road ahead?

Of course, it isn’t easy to maintain a high level of alertness while doing nothing in particular. It takes a certain maturity of mind, or at least a lack of ADD. Which explains why California, a leader in regulations for autonomous vehicles, imposes restrictions on who is allowed to “drive” them. Prerequisites include a near-spotless driving record and more than 10 years without a DUI conviction. Drivers must also complete an autonomous driving program, the length of which depends on the car maker or automotive supplier in question. According to a recent investigation by IEEE Spectrum, Google offers the most comprehensive program — it lasts five weeks and subjects drivers to random checks.

1950s approach to improving driver alertness. Source: Modern Mechanix blog

In effect, drivers of autonomous cars have to learn how not to drive. And, as another IEEE article suggests, they may even need a special license.

Ample warnings
Could an autonomous car mitigate the attention issue? Definitely. It could, for example, give the driver ample warning before he or she needs to take over. The forward collision alerts and other informational ADAS functions in the latest QNX technology concept car offer a hint as to how such warnings could operate. For the time being, however, it’s hard to imagine an autonomous car that could always anticipate when it needs to cede control. Until then, informational ADAS will serve as an adjunct to, not a replacement for, eyes, ears, and old-fashioned attentiveness.

Nonetheless, research suggests that adaptive cruise control and other technologies that enable autonomous or semi-autonomous driving can, when compared to human drivers, do a better job of avoiding accidents and improving traffic flow. To quote my friend Andy Gryc, autonomous cars would be more “polite” to other vehicles and be better equipped to negotiate inter-vehicle space, enabling more cars to use the same length of road.

Fewer accidents, faster travel times. I could live with that.


2015 approach to improving driver alertness: instrument cluster from the QNX reference vehicle.

Hypervisors, virtualization, and taking control of your safety certification budget

A new webinar on how virtualization can help you add new technology to existing designs.

First things first: should you say “hypervisor” or “virtual machine monitor”? Both terms refer to the same thing, but is one preferable to the other?

Hypervisor certainly has the greater sex appeal, suggesting it was coined by a marketing department that saw no hope in promoting a term as coldly technical as virtual machine monitor. But, in fact, hypervisor has a long and established history, dating back almost 50 years. Moreover, it was coined not by a marketing department, but by a software developer.

“Hypervisor” is simply a variant of “supervisor,” a traditional name for the software that controls task scheduling and other fundamental operations in a computer system — software that, in most systems, is now called the OS kernel. Because a hypervisor manages the execution of multiple OSs, it is, in effect, a supervisor of supervisors. Hence hypervisor.

No matter what you call it, a hypervisor creates multiple virtual machines, each hosting a separate guest OS, and allows the OSs to share a system’s hardware resources, including CPU, memory, and I/O. As a result, system designers can consolidate previously discrete systems onto a single system-on-chip (SoC) and thereby reduce the size, weight, and power consumption of their designs — a trinity of benefits known as SWaP.

That said, not all hypervisors are created equal. There are, for example, Type 1 “bare metal” hypervisors, which run directly on the host hardware, and Type 2 hypervisors, which run on top of an OS. Both types have their benefits, but Type 1 offers the better choice for any embedded system that requires fast, predictable response times — most safety-critical systems arguably fall within this category.

The QNX Hypervisor is an example of a Type 1 “bare metal” hypervisor.


Moreover, some hypervisors make it easier for the guest OSs to share hardware resources. The QNX Hypervisor, for example, employs several technologies to simplify the sharing of display controllers, network connections, file systems, and I/O devices like the I2C serial bus. Developers can, as a result, avoid writing custom shared-device drivers that increase testing and certification costs and that typically exhibit lower performance than field-hardened, vendor-supplied drivers.

Adding features, without blowing the certification budget
Hypervisors, and the virtualization they provide, offer another benefit: the ability to keep OSs cleanly isolated from each other, even though they share the same hardware. This benefit is attractive to anyone trying to build a safety-critical system and reduce SWaP. Better yet, virtualization can help device makers add new and differentiating features, such as rich user interfaces, without compromising safety-critical components.

That said, hardware and peripheral device interfaces are evolving continuously. How can you maintain compliance with safety-related standards like ISO 26262 and still take advantage of new hardware features and functionality?

Enter a new webinar hosted by my inimitable colleague Chris Ault. Chris will examine techniques that enable you to add new features to existing devices, while maintaining close control of the safety certification scope and budget. Here are some of the topics he’ll address:

  • Overview of virtualization options and their pros and cons
  • Comparison of how adaptive time partitioning and virtualization help achieve separation of safety-critical systems
  • Maintaining realtime performance of industrial automation protocols without directly affecting safety certification efforts
  • Using Android applications for user interfaces and connectivity
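One of the topics above, adaptive time partitioning, is easy to illustrate: each partition is guaranteed a share of CPU time in every scheduling window, and time that one partition leaves unused can be lent to partitions that still have runnable work. Here is a minimal sketch of that idea; the partition names and budget percentages are invented for illustration and are not QNX APIs:

```python
# Minimal sketch of adaptive time partitioning: each partition is
# guaranteed a share of every scheduling window, and budget left
# unused by idle partitions is redistributed to partitions that
# still want CPU. Names and numbers are illustrative only.

def schedule_window(budgets, demand, window_ms=100):
    """budgets: partition -> guaranteed % of the window.
    demand: partition -> ms of CPU the partition wants this window.
    Returns partition -> ms of CPU actually granted."""
    granted = {}
    spare = 0.0
    for part, pct in budgets.items():
        guarantee = window_ms * pct / 100.0
        used = min(guarantee, demand.get(part, 0.0))
        granted[part] = used
        spare += guarantee - used
    # Redistribute spare time among partitions that still want CPU.
    hungry = [p for p in budgets if demand.get(p, 0.0) > granted[p]]
    while spare > 1e-9 and hungry:
        share = spare / len(hungry)
        spare = 0.0
        still_hungry = []
        for p in hungry:
            take = min(share, demand[p] - granted[p])
            granted[p] += take
            spare += share - take
            if granted[p] < demand[p] - 1e-9:
                still_hungry.append(p)
        hungry = still_hungry
    return granted

budgets = {"safety": 70, "infotainment": 30}
# Safety partition nearly idle: infotainment may borrow the slack...
print(schedule_window(budgets, {"safety": 10, "infotainment": 90}))
# ...but when both are busy, each partition's guarantee is enforced.
print(schedule_window(budgets, {"safety": 70, "infotainment": 90}))
```

The key property for safety certification is visible in the second call: no matter how greedy the infotainment partition gets, the safety partition always receives its guaranteed budget.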

Webinar coordinates:
Exploring Virtualization Options for Adding New Technology to Safety-Critical Devices
Time: Thursday, March 5, 12:00 pm EST
Duration: 1 hour
Registration: Visit TechOnLine

Have you heard about Phantom Intelligence yet?

If you haven’t, I bet you will. Phantom Intelligence is a startup that is looking to revolutionize LiDAR for automotive. I hadn’t heard of them either until QNX and Phantom Intelligence found themselves involved in a university project in 2014. They had some cool technology and are just all-around good guys, so we started to explore how we could work together at CES 2015. One thing led to another and their technology was ultimately featured in both the QNX reference vehicle and the new QNX technology concept car.

I knew little about LiDAR at the beginning of the partnership. But as I started to ramp up my knowledge I learned that LiDAR can provide valuable sensor input into ADAS systems. Problem is, LiDAR solutions are big, expensive, and have not, for the most part, provided the kind of sensitivity and performance that automakers look for.

Phantom Intelligence is looking to change all this with small, cost-effective LiDAR systems that can detect not just metal, but also people (handy if you are crossing the street and left your Tin Man costume at home) and that are impervious to inclement weather. As a frequent pedestrian this is all music to my ears.

I am still in no way qualified to offer an intelligent opinion on the pros and cons of competing LiDAR technology, so I’m just going on the positive feedback I heard from customers and other suppliers in the ADAS space at CES. Phantom turned out to be one of the surprise hits this year, and they are just getting started. That’s why I think you will hear more about them soon.


Both QNX vehicles showcased at CES 2015 use a LiDAR system from Phantom Intelligence to detect obstacles on the road ahead.

New to 26262? Have I got a primer for you

Driver error is the #1 problem on our roads — and has been since 1869. In August of that year, a scientist named Mary Ward became the first person to die in an automobile accident, after being thrown from a steam-powered car. Driver error was a factor in Mary’s death and, 145 years later, it remains a problem, contributing to roughly 90% of motor vehicle crashes.

Can ADAS systems mitigate driver error and reduce traffic deaths? The evidence suggests that, yes, they help prevent accidents. That said, ADAS systems can themselves cause harm, if they malfunction. Imagine, for example, an adaptive cruise control system that underestimates the distance of a car up ahead. Which raises the question: how can you trust the safety claims for an ADAS system? And how do you establish that the evidence for those claims is sufficient?

Enter ISO 26262. This standard, introduced in 2011, provides a comprehensive framework for validating the functional safety claims of ADAS systems, digital instrument clusters, and other electrical or electronic systems in production passenger vehicles.

ISO 26262 isn’t for the faint of heart. It’s a rigorous, 10-part standard that recommends tools, techniques, and methodologies for the entire development cycle, from specification to decommissioning. In fact, to develop a deep understanding of 26262 you must first become versed in another standard, IEC 61508, which forms the basis of 26262.

ISO 26262 starts from the premise that no system is 100% safe. Consequently, the system designer must perform a hazard and risk analysis to identify the safety requirements and residual risks of the system being developed. The outcome of that analysis determines the Automotive Safety Integrity Level (ASIL) of the system, as defined by 26262. ASILs range from A to D, where A represents the lowest degree of hazard and D, the highest. The higher the ASIL, the greater the degree of rigor that must be applied to assure the system avoids residual risk.
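The ASIL that falls out of the hazard and risk analysis is driven by three ratings: severity (S1 to S3), exposure (E1 to E4), and controllability (C1 to C3). A commonly cited simplification of the determination table in ISO 26262-3 simply adds the three ratings: a sum of 10 maps to ASIL D, 9 to C, 8 to B, 7 to A, and anything lower to QM (quality management, no ASIL). Here is a sketch of that rule of thumb; for real projects, use the table in the standard itself:

```python
# Rule-of-thumb ASIL determination from severity (S1-S3),
# exposure (E1-E4), and controllability (C1-C3) ratings.
# This additive shortcut reproduces the pattern of the ISO 26262-3
# table; real projects must consult the standard's own table.

def asil(severity, exposure, controllability):
    if not (1 <= severity <= 3 and 1 <= exposure <= 4
            and 1 <= controllability <= 3):
        raise ValueError("rating out of range")
    total = severity + exposure + controllability
    return {10: "ASIL D", 9: "ASIL C", 8: "ASIL B", 7: "ASIL A"}.get(total, "QM")

print(asil(3, 4, 3))  # worst case on every axis -> ASIL D
print(asil(3, 2, 1))  # severe, but rare and easily controlled -> QM
```

Note how a hazard with life-threatening severity can still land at QM if it is rare enough and easy for the driver to control; the standard deliberately weighs all three factors, not severity alone.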

Having determined the risks (and the ASIL), the system designer selects an appropriate architecture. The designer must also validate that architecture, using tools and techniques that 26262 either recommends or highly recommends. If the designer believes that a recommended tool or technique isn’t appropriate to the project, he or she must provide a solid rationale for the decision, and must justify why the technique actually used is as good as or better than the one recommended by 26262.

The designer must also prepare a safety case. True to its name, this document presents the case that the system is sufficiently safe for its intended application and environment. It comprises three main components: 1) a clear statement of what is claimed about the system, 2) the argument that the claim has been met, and 3) the evidence that supports the argument. The safety case should convince not only the 26262 auditor, but also the entire development team, the company’s executives, and, of course, the customer. Of course, no system is safe unless it is deployed and used correctly, so the system designer must also produce a safety manual that sets the constraints within which the product must be deployed.

Achieving 26262 compliance is a major undertaking. That said, any conscientious team working on a safety-critical project would probably apply most of the recommended techniques. The standard was created to ensure that safety isn’t treated as an afterthought during final testing, but as a matter of due diligence in every stage of development.

If you’re a system designer or implementer, where do you start? I would suggest “A Developer’s View of ISO 26262”, an article recently authored by my colleague Chris Hobbs and published in EE Times Automotive Europe. The article provides an introduction to the standard, based on experience of certifying software to ISO 26262, and covers key topics such as ASILs, recommended verification tools and techniques, the safety case, and confidence from use.

I also have two whitepapers that may prove useful: Architectures for ISO 26262 systems with multiple ASIL requirements, written by my colleague Yi Zheng, and Protecting software components from interference in an ISO 26262 system, written by Chris Hobbs and Yi Zheng.

Driving simulators at CES

CES was just 15 minutes from closing when I managed to slip away from the very busy QNX booth to try out an F1 simulator. Three screens, 6 degrees of freedom, and surround sound came together for the most exciting simulated driving experience I have ever had. I was literally shaking when they dragged me out of the driver’s seat (I didn’t want to stop :-). Mind you, at around $80K for the system, it seems unlikely I will ever own one.

The experience got me thinking about the types of vehicles currently in simulation or in the lab that I fully expect to drive in my lifetime: cars that are virtually impossible to crash, cars that make it painless to travel long distances, and, ultimately, cars that worry about traffic jams so I can read a book.

Re-incarnated: The QNX reference vehicle.
QNX Software Systems had a very popular simulator of its own at CES this year. You may have seen some details on it already, but to recap: it is a new incarnation of our trusty QNX reference vehicle, extended to demonstrate ADAS capabilities. We parked it in front of a 12-foot display and used video footage captured on California’s fabled Highway 1 to provide the closest thing to real-world driving we could create.

The resulting virtual drive showcased the capabilities not only of QNX technology, but of our ecosystem as well. Using the video footage, we provided camera inputs to Itseez’ computer vision algorithms to demonstrate a working example of lane departure warning and traffic sign recognition. By capturing GPS data synchronized with the video footage, and feeding the result through Elektrobit’s Electronic Horizon Solution, we were able to generate curve speed warnings. All this was running on automotive-grade Jacinto 6 silicon from Texas Instruments. LiDAR technology from Phantom Intelligence rounded out the offering by providing collision feedback to the driver.

The lane departure and curve speed warnings in action. Screen-grab from video by Embedded Computing Design.

Meeting the challenge
While at CES, I also had the opportunity to meet with companies that are working to make advanced ADAS systems commercially viable. Phantom Intelligence is one example but I was also introduced to companies that can provide thermal imaging systems and near-infrared cameras at a fraction of what these technologies cost today.

These are all examples of how the industry is rising up to meet the challenge of safer, more autonomous vehicles at a price point that allows for widespread adoption in the foreseeable future. Amazing stuff, really — we are finally entering the era of the Jetsons.

By the way, I can’t remember what booth I was in when I drove the simulator. But I’m willing to bet that the people who experienced the Jeep at CES will remember they were in the QNX booth, seeing technology from QNX and its key partners in this exciting new world.

Tom’s Guide taps QNX concept car with CES 2015 award

Have you ever checked out a product review on Tom’s Guide? If so, you’re not alone. Every month, this website attracts more than 2.5 million unique visitors — that’s equivalent to the population of Toronto, the largest city in Canada.

The folks at Tom’s Guide test and review everything from drones to 3D printers. They love technology. So perhaps it’s no surprise that they took a shine to the QNX technology concept car. In fact, they liked it so much, they awarded it the Tom’s Guide CES 2015 Award, in the car tech category.

To quote Sam Rutherford of Tom’s Guide, “After my time with QNX’s platform, I was left with the impression there’s finally a company that just ‘gets it’ when it comes to the technology in cars. The company has learned from the success of modern mobile devices and brought that knowledge to the auto world…”.

I think I like this Sam guy.

Engadget was also impressed...
A forward-looking approach to seeing behind you.
The Tom’s Guide award is the second honor QNX picked up at CES. We were also shortlisted for an Engadget Best of CES award, for the digital rear- and side-view mirrors on the QNX technology concept car.

If you haven’t seen the mirrors in action, they offer a complete view of the scene behind and to the sides of the vehicle — goodbye to the blind spots associated with conventional reflective mirrors. Better yet, the side-view digital mirrors have the smarts to detect cars, bicycles, and other objects, and they will display an alert if an object is too close when the driver signals a lane change.

In addition to the digital mirrors, the QNX technology concept car integrates several other ADAS features, including speed recommendations, forward-collision warnings, and intelligent parking assist. Learn more here.

Finalist for Engadget Best of CES Awards 2015

By Lynn Gayowski

*Fist pump!* The accolades from CES just keep coming. I'm excited to share the news that the digital mirrors implemented in our 2015 QNX technology concept car have been selected by Engadget as a finalist for their Best of CES Awards 2015, in the Best Automotive Technology category!

With advanced driver assistance systems (ADAS) influential in the design of this year's QNX vehicle, replacing the mirrors on the Maserati with digital screens to warn of possible collisions and enhance visibility for the driver was a natural choice.

Not only do the side-view screens eliminate blind spots, they also give a red warning overlay if an obstacle is in the way when making a lane change. If the coast is clear, the overlay is green.
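The red/green overlay described above boils down to a simple decision: is any detected object inside the blind zone while the driver is signalling a lane change? A minimal sketch of that logic follows; the function name, threshold, and sensor interface are invented for illustration and are not the concept car's actual code:

```python
# Illustrative decision logic for the side-view lane-change overlay:
# red if any detected object is within the blind zone while the turn
# signal is on, green if the coast is clear, no overlay otherwise.
# The 3-metre danger radius is an assumed value, not the real one.

BLIND_ZONE_M = 3.0  # assumed danger radius beside the car, in metres

def overlay_color(signal_on, object_distances_m):
    """object_distances_m: distances of detected objects, in metres."""
    if not signal_on:
        return None  # no overlay unless a lane change is signalled
    if any(d <= BLIND_ZONE_M for d in object_distances_m):
        return "red"
    return "green"

print(overlay_color(True, [7.5, 2.1]))  # cyclist at 2.1 m -> red
print(overlay_color(True, [12.0]))      # nothing nearby -> green
print(overlay_color(False, [2.1]))      # no signal -> no overlay
```

In a real system the distances would come from the cameras' object-detection pipeline rather than a plain list, but the warning decision itself can stay this simple.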

The rear-view display provides a wide-angle view behind the car, giving the driver an expanded picture that's larger than what you'd see with a typical mirror.

Powered by the reliable QNX OS, these digital mirrors could be a feature that helps drivers of the future avoid accidents.

The rear- and side-view video displays in the 2015 QNX technology concept car based on a Maserati Quattroporte GTS offer a complete view behind and to the sides of the vehicle, eliminating blind spots.

If you're attending CES, check out the digital mirrors and the many other ADAS and infotainment demos in the QNX booth: North Hall, Booth 2231.


Now with ADAS: The revamped QNX reference vehicle

Tina Jeffrey
Since 2012, our Jeep has showcased what QNX technology can do out of the box. We decided it was time to up the ante...

I walked into the QNX garage a few weeks ago and did a double take. The QNX reference vehicle, a modified Jeep Wrangler, had undergone a major overhaul both inside and out — and just in time for 2015 CES.

Before I get into the how and why of the Jeep’s metamorphosis, here’s a glimpse of its newly refreshed exterior. Orange is the new gray!



The Jeep debuted in June 2012 at Telematics Detroit. Its purpose: to show how customers can use off-the-shelf QNX products, like the QNX CAR Platform for Infotainment and QNX OS, to build a wide range of custom infotainment systems and instrument clusters, using a single code base.

From day one, the Jeep has been a real workhorse, making appearances at numerous events to showcase the latest HMI, navigation, speech recognition, multimedia, and handsfree acoustics technologies, not to mention embedded apps for parking, internet radio streaming, weather, and smartphone connectivity. The Jeep has performed dependably time and time again, and now, in an era where automotive safety is top of mind, we’ve decided to up the ante and add leading-edge ADAS technology built on the QNX OS.

After all, what sets the QNX OS apart is its proven track record in safety-certified systems across market segments — industrial, medical, and automotive. In fact, the QNX OS for Automotive Safety is certified to the highest level of automotive functional safety: ISO 26262, ASIL D. Using a pre-certified OS component is key to the overall integrity of an automotive system and makes system certification much easier.

The ultimate (virtual) driving experience
What better way to showcase ADAS in the Jeep than with a virtual drive? At CES, a 12-foot video screen in front of the Jeep plays a pre-recorded driving scene, while the onboard ADAS system analyzes the scene to detect lane markers, speed signs, and preceding vehicles, and to warn of unintentional lane departures, excessive speed, and imminent crashes with vehicles on the road ahead. Onboard computer vision algorithms from Itseez process the image frames in real time to perform these functions simultaneously.
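Once the vision algorithms have located the lane boundaries in each frame, the lane-departure check itself is straightforward geometry: warn when the vehicle's centre drifts too close to either boundary. Here is a sketch of that final step, with invented pixel coordinates and an assumed warning margin (the detection pipeline that produces the boundary positions is the hard part and is not shown):

```python
# Sketch of a lane-departure check layered on top of detected lane
# markers: once vision has found the left/right boundary positions in
# the image, drifting is just the vehicle centre moving too close to
# either boundary. Pixel values and the margin are illustrative.

def lane_departure(left_x, right_x, vehicle_center_x, margin=0.2):
    """left_x / right_x: detected lane-boundary positions (pixels).
    Warn when the centre is within `margin` of the lane width
    from either boundary; return None when safely centred."""
    width = right_x - left_x
    if width <= 0:
        raise ValueError("lane boundaries out of order")
    offset = (vehicle_center_x - left_x) / width  # 0.0 = left edge, 1.0 = right
    if offset < margin:
        return "departing left"
    if offset > 1.0 - margin:
        return "departing right"
    return None

print(lane_departure(100, 500, 300))  # centred in lane -> None
print(lane_departure(100, 500, 130))  # hugging the left marker
```

A production system would also smooth the boundary positions over several frames and suppress the warning when the turn signal is on, so an intentional lane change doesn't trigger haptic feedback.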

Here’s a scene from the virtual drive, in which the ADAS system is tracking lane markings and has detected a speed-limit sign:



If the vehicle begins to drift outside a lane, the steering wheel provides haptic feedback and the cluster displays a warning:



The ADAS system includes Elektrobit EB Assist eHorizon, which uses map data with curve-speed information to provide warnings and recommendations, such as reducing your speed to navigate an upcoming curve:



The Jeep also has a LiDAR system from Phantom Intelligence (formerly Aerostar) to detect obstacles on the road ahead. The cluster displays warnings from this system, as well as warnings from the vision-based collision-detection feature. For example:



POSTSCRIPT:
Here’s a short video of the virtual drive, taken at CES by Brandon Lewis of Embedded Computing Design, in which you can see curve-speed warnings and lane-departure warnings:



Fast-boot camera
Rounding out the ADAS features is a rear-view camera demo that can cold boot in 0.8 seconds on a Texas Instruments Jacinto 6 processor. As you may recall, NHTSA has mandated that, by May 2018, most new vehicles must have rear-view technology that can display a 10-by-20-foot area directly behind the vehicle; moreover, the display must appear no more than 2 seconds after the driver shifts into reverse. Backup camera and other fast-boot requirements, such as time-to-last-mode audio, time-to-HMI-visible, and time-to-fully-responsive HMI, are critically important to automakers. Be sure to check out the demo — but don’t blink or you’ll miss it!

Full-featured infotainment
The head unit includes a full-featured infotainment system based on the QNX CAR Platform for Infotainment and provides information such as weather, the current song, and turn-by-turn directions to the instrument cluster, where it’s easier for the driver to see.



Infotainment features include:

Qt-based HMI — Can integrate other HMI technologies, including Elektrobit EB Guide and Crank Storyboard.

Natural language processing (NLP) — Uses Nuance’s Vocon Hybrid solution in concert with the QNX NLP technology for natural interaction with infotainment functions. For instance, if you ask “Will I need a jacket later today?”, the Weather Network app will launch and provide the forecast.

EB street director — Provides embedded navigation with a 3D map engine; the map is synched up with the virtual drive during the demo.

QNX CAR Platform multimedia engine — An automotive-hardened solution that can handle:
  • audio management for seamless transitions between all audio sources
  • media detection and browsing of connected devices
  • background synching of music for instant media playback — without the need for the synch to be completed

Support for all smartphone connectivity options — DLNA, MTP, MirrorLink, Bluetooth, USB, Wi-Fi, etc.

On-board application framework — Supports Qt, HTML5, APK (for Android apps), and native OpenGL ES apps. Apps include iHeart, Parkopedia, Pandora, Slacker, and Weather Network, as well as a Settings app for phone pairing, over-the-air software updates, and Wi-Fi hotspot setup.

So if you’re in the North Hall at CES this week, be sure to take a virtual ride in the QNX reference vehicle in Booth 2231. Beneath the fresh paint job, it’s the same workhorse it has always been, but now with new ADAS tech automakers are thirsting for.

Volkswagen and LG Gear up with QNX

Design wins put QNX technology in a wide range of infotainment systems, instrument clusters, and ADAS solutions.

Earlier today, QNX Software Systems announced that infotainment systems powered by the QNX Neutrino OS are now shipping in several 2015 Volkswagen vehicle models, including the Touareg, Passat, Polo, Golf, and Golf GTI.

The systems include the RNS 850 GPS navigation system in the Volkswagen Touareg, which recently introduced support for 3D Google Earth maps and Google Street View. The system also offers realtime traffic information, points-of-interest search, reverse camera display, voice control, Bluetooth connectivity, rich multimedia support, four-zone climate control, a high-resolution 8-inch color touchscreen, and other advanced features.

Bird's eye view: the RNS 850 GPS navigation system for the Volkswagen Touareg SUV. Source: VW

“At Volkswagen, we believe deeply in delivering the highest quality driving experience, regardless of the cost, size, and features of the vehicle,” commented Alf Pollex, Head of Connected Car and Infotainment at Volkswagen AG. “The scalable architecture of the QNX platform is well-suited to our approach, enabling us to offer a full range of infotainment systems, from premium level to mass volume, using a single, proven software base for our Modular Infotainment Modules (MIB) and the RNS 850 system.”

QNX and LG: a proven partnership
QNX also announced that LG Electronics’ Vehicle Components (VC) Company will use a range of QNX solutions to build infotainment systems, digital instrument clusters, and advanced driver assistance systems (ADAS) for the global automotive market.

The new initiative builds on a long history of collaboration between LG and QNX Software Systems, who have worked together on successful, large-volume telematics production programs. For the new systems, QNX will provide LG with the QNX CAR Platform for Infotainment, the QNX Neutrino OS, the QNX OS for Automotive Safety, and QNX Acoustics for Voice.

“QNX Software Systems has been our trusted supplier
for more than a decade... helping LG deliver millions
of high-quality systems to the world’s automakers”

— Won-Yong Hwang, LG's VC Company

“QNX Software Systems has been our trusted supplier for more than a decade, providing flexible software solutions that have helped LG deliver millions of high-quality systems to the world’s automakers,” commented Won-Yong Hwang, Director and Head of AVN development department, LG Electronics’ VC Company. “This same flexibility allows us to leverage our existing QNX expertise in new and growing markets such as ADAS, where the proven reliability of QNX Software Systems’ technology can play a critical role in addressing automotive safety requirements.”

Visit the QNX website to learn more about the Volkswagen and LG announcements.

To infotainment... and beyond! First look at new QNX technology concept car

The new car delivers everything you’d expect in a concept vehicle from QNX. But the real buzz can be summarized in a four-letter word: ADAS

The technology in today's cars is light-years ahead of the technology in cars 10 or 20 years ago. The humans driving those cars, however, have changed little in the intervening years. They still need to focus on a host of mundane driving tasks, from checking blind spots and monitoring road signs to staying within the lane and squeezing into parking spaces. In fact, with all the technology now in the car, including a variety of brought-in devices, some drivers suffer from information overload and perform worse, instead of better, at these crucial tasks.

Advanced driver assistance systems, or ADAS, can go a long way to offset this problem. They come in a variety of shapes and sizes — from drowsiness monitoring to autonomous emergency braking — but most share a common goal: to help the driver avoid accidents.

Which brings us to the new QNX technology concept car. As you’d expect, it includes all the advanced infotainment features, including smartphone connectivity and rich app support, offered by the QNX CAR Platform for Infotainment. But it also integrates an array of additional technologies — including cameras, LiDAR, ultrasonic sensors, and specialized navigation software — to deliver ADAS capabilities that simplify driving tasks, warn of possible collisions, and enhance overall driver awareness.

Mind you, the ADAS features shouldn’t come as any more of a surprise than the infotainment features. After all, QNX Software Systems also offers the QNX OS for Automotive Safety, a solution based on decades of experience in safety-critical systems and certified to ISO 26262, Automotive Safety Integrity Level D — the highest level achievable.

Okay, enough blather. Time to check out the car!

The “I want that” car
If the trident hasn’t already tipped you off, the new technology concept car is based on a Maserati Quattroporte GTS. I won’t say much about the car itself, except I want one. Did I say want? Sorry, I meant lust. Because omigosh:



The differentiated dash
Before we run through the car’s many features, let’s stop for a minute and savor the elegant design of its QNX-powered digital instrument cluster and infotainment system. To be honest, I have an ulterior motive for sharing this image: if you compare the systems shown here to those of previous QNX technology concept cars (here, here, and here), you’ll see that they each project a distinct look-and-feel. Automakers need to differentiate themselves, and, as a group, these cars illustrate how the flexibility of the QNX platform enables unique, branded user experiences:



The multi-talented digital instrument cluster
Okay, let’s get behind the wheel and test out the digital cluster. Designed to heighten driver awareness, the cluster can show the current speed limit, display an alert if you exceed the limit, and even recommend an appropriate speed for upcoming curves. Better yet, it can display turn-by-turn directions provided by the car’s infotainment system.

Normally, the cluster displays the speed limit in a white circle. But in this image, the cluster displays it in red, along with a red bar to show how much you are over the limit — a gentle reminder to ease off the gas:



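The underlying logic is simple enough to sketch. Here is a rough, hypothetical illustration of how a cluster might choose between the white circle and the red over-limit treatment; the function name and return values are my own, not QNX code:

```python
# Hypothetical sketch: choosing how a cluster renders the speed limit.
# Thresholds and return values are illustrative, not from the concept car.

def speed_limit_style(speed_kmh: float, limit_kmh: float):
    """Return (color, overage) for the cluster's speed-limit widget."""
    if speed_kmh <= limit_kmh:
        return ("white", 0.0)                  # normal white circle
    return ("red", speed_kmh - limit_kmh)      # red circle plus overage bar

print(speed_limit_style(97, 100))   # ('white', 0.0)
print(speed_limit_style(112, 100))  # ('red', 12)
```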
Using LiDAR input, the cluster can also warn of obstacles on the road ahead:



And if that’s not enough, the cluster provides intelligent parking assist to help you back into tight spaces. Here, for example, is an impromptu image we took in the QNX garage. The blue-and-yellow guidelines represent the car’s reverse trajectory, and the warning on the right says that you are about to run over an esteemed member of the QNX concept team!




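Guidelines like these are typically predicted from the current steering angle. As a rough illustration using a simple bicycle model (the wheelbase value and function name are illustrative assumptions, not details of the concept car):

```python
import math

# Hypothetical sketch: predict lateral drift while reversing at a fixed
# steering angle, using a simple bicycle model. The wheelbase is an
# illustrative value, not a specification of the actual vehicle.

def reverse_drift(steer_deg: float, wheelbase_m: float = 3.17,
                  distance_m: float = 3.0) -> float:
    """Lateral offset (m) after reversing distance_m metres."""
    if abs(steer_deg) < 1e-6:
        return 0.0  # wheels straight: the car backs up in a straight line
    radius = wheelbase_m / math.tan(math.radians(steer_deg))  # turn radius
    return radius * (1 - math.cos(distance_m / radius))       # chord offset

print(round(reverse_drift(0.0), 2))   # 0.0
print(round(reverse_drift(20.0), 2))  # roughly half a metre of drift
```

Sampling this drift at several distances gives the curved guideline overlaid on the camera image.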
The rear- and side-view mirrors that aren’t really mirrors
By their very nature, car mirrors have blind spots. To address this problem, the QNX concept team has transformed the car’s rear- and side-view mirrors into video displays that offer a complete view of the scene behind and to the sides of the vehicle. As you can see in this image, the side-view displays can also show a red overlay to warn of cars, bikes, people, or anything else approaching the car’s blind zones:



The ADAS display for enhancing obstacle awareness
I don’t have pictures yet, but the car also includes an innovative LED-based display that lets you gauge the direction and proximity of objects to the front, rear, and sides of the vehicle — without having to take your eyes off the road. Stretching the width of the dash, the display integrates input from the car’s ultrasonic and LiDAR sensors to provide a centralized view of ADAS warnings.

The easy-to-use infotainment system
To demonstrate the capabilities of the QNX CAR™ Platform for Infotainment, we’ve outfitted the car with a feature-rich, yet intuitive, multimedia head unit. For instance, see the radio tuner in the following image? That’s no ordinary tuner. To change channels, you can just swipe across the display; if your swipe isn’t perfectly accurate, the radio will automatically zero in on the nearest station or preset.
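That “zero in” behaviour boils down to snapping the frequency where the swipe ends to the closest known station. A minimal sketch, with an invented preset list rather than a real station table:

```python
# Hypothetical sketch of the tuner's snap-to-station behaviour.
# The preset frequencies (MHz) are invented for illustration.

def snap_to_station(swipe_end_mhz: float, stations: list[float]) -> float:
    """Return the station closest to where the swipe landed."""
    return min(stations, key=lambda f: abs(f - swipe_end_mhz))

presets = [88.5, 94.9, 101.1, 106.9]
print(snap_to_station(95.3, presets))  # 94.9
```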

Better yet, the radio offers “iHeart drive anywhere radio.” If you drive out of range of your favorite AM/FM radio station, the system will detect the problem and automatically switch to the corresponding digital iHeartRadio station. How cool is that?



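Conceptually, the handoff is a signal-strength check with a broadcast-to-streaming mapping behind it. Here’s a minimal sketch; the threshold, station names, and URI scheme are all invented for illustration, not iHeartRadio’s actual interface:

```python
# Hypothetical sketch of the "drive anywhere" handoff: when broadcast
# signal strength drops below a floor, fall back to a matching stream.
# Mapping, threshold, and URIs are illustrative assumptions.

FALLBACK = {"FM 101.1": "iheart://station/101one"}

def pick_source(station: str, signal_strength: float,
                floor: float = 0.2) -> str:
    """Choose broadcast when the signal is usable, streaming otherwise."""
    if signal_strength >= floor:
        return station
    return FALLBACK.get(station, station)  # stream only if a match exists

print(pick_source("FM 101.1", 0.8))   # FM 101.1
print(pick_source("FM 101.1", 0.05))  # iheart://station/101one
```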
Other infotainment features include:
  • Natural voice recognition — For instance, if you say “It’s way too cold in here,” the HVAC system will respond by raising the heat.
  • Integration with a wide variety of popular smartphones.
  • Support for multiple concurrent app environments, along with a variety of Android and HTML5 apps, as well as an HMI built with the Qt framework.
  • A backseat display that lets passengers control HVAC functions, navigation, song selection, and other infotainment features.
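The natural voice recognition feature above maps loose phrasing to a concrete action. As a toy sketch of that intent-mapping step (the cue phrases and temperature step are my own assumptions, not how the Nuance-based system actually works):

```python
# Hypothetical sketch: map a natural utterance to an HVAC set-point change.
# Cue list and 2-degree step are invented for illustration.

COLD_CUES = ("too cold", "freezing", "chilly")

def hvac_intent(utterance: str, current_temp_c: float) -> float:
    """Return the cabin set-point implied by the utterance."""
    text = utterance.lower()
    if any(cue in text for cue in COLD_CUES):
        return current_temp_c + 2.0   # raise the heat a notch
    return current_temp_c             # no recognised climate intent

print(hvac_intent("It's way too cold in here", 20.0))  # 22.0
```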

The oh-so-awesome partners
The car is a testament not only to QNX technology, but to the ecosystem of technology partners that provide complementary solutions for QNX customers. Peek under the hood, and you'll find the latest tech from Elektrobit, iHeart, Nuance, Pandora, Parkopedia, Phantom Intelligence, Qualcomm, RealVNC, Rightware, and TE Connectivity.

The other stuff
Do not, for one minute, think that the Maserati is the only attraction in the QNX booth. Far from it. We will also showcase a significantly revamped QNX reference vehicle, outfitted with lane departure warnings, traffic sign recognition, and other ADAS features, as well as the latest version of the QNX CAR Platform — more in an upcoming post.

Visitors to the booth will also have the opportunity to experience:
  • a 3D navigation solution from Aisin AW
  • a digital instrument cluster designed by HI Corporation
  • two QNX CAR Platform demo systems, one powered by a dual-core Intel Atom E3827 processor, the other by an NVIDIA Tegra Visual Computing Module
  • the latest incarnation of the Oscar-winning Flying Cam SARAH aerial camera system


One day I’ll be Luke Skywalker

Cyril Clocher
What happens when you blend ADAS with infotainment? Guest post by Cyril Clocher, business manager for automotive processors at Texas Instruments

As we all begin preparing for our trek to Vegas for CES 2015, I would like my young friends (born in the 70s, of course) to reflect on their impressions of the first episode of Lucas’s trilogy back in 1977. For my part, I remember perfectly thinking that one day I would be Luke Skywalker.

Young boys and girls were amazed by this epic space opera, and particularly by the technologies our heroes used to fight the Galactic Empire. You have to remember, it was an era when we still used rotary phones and GPS was in its infancy. So you can imagine how impactful it was for us to see our favorite characters using wireless electronic gadgets with revolutionary HMIs such as natural voice recognition, gesture controls, or touch screens; droids speaking and enhancing human intelligence; and autonomous vehicles traveling the galaxy safely while playing chess with a Wookiee. Now you’re with me…

But instead of becoming Luke Skywalker a lot of us realized that we would have a bigger impact by inventing or engineering these technologies and by transforming early concepts into real products we all use today. As a result, smartphones and wireless connectivity are now in our everyday lives; the Internet of Things (IoT) is getting more popular in applications such as activity trackers that monitor personal metrics; and our kids are more used to touch screens than mice or keyboards, and cannot think of on-line gaming without gesture control. In fact, I just used voice recognition to upgrade the Wi-Fi plan from my Telco provider.

But the journey is not over yet. Our generation still has to deliver an autonomous vehicle that is green, safe, and fun to control – I think the word “drive” will be obsolete for such a vehicle.

The automotive industry has taken several steps toward this exciting goal, including the integration of advanced, connected infotainment systems into more models, as well as a number of technologies categorized under Advanced Driver Assistance Systems (ADAS) that can create a safer and more distinctive driving experience. For more than a decade, Texas Instruments has invested in infotainment and ADAS, with “Jacinto” and TDAx automotive processors as well as the many analog companion chips supporting these trends.

"Jacinto 6 EP" and "Jacinto 6 Ex" infotainment processors
What sets TI apart is our ability to leverage the best of both worlds for non-safety-critical features and to seamlessly integrate informational ADAS functions into existing infotainment systems, so the vehicle better informs and warns the driver. We announced that capability at SAE Convergence in Detroit in October 2014 with the “Jacinto 6 Ex” processor (DRA756), which combines powerful CPU, graphics, multimedia, and radio cores with differentiated vision co-processors, called embedded vision engines (EVEs), and additional DSPs that perform the complex ADAS processing.

For TI’s automotive team, CES 2015 is even more exciting than previous years, as we’ve taken our concept of informational ADAS to the next step. Through the joint efforts and hard work of the TI and QNX teams, we’ve implemented a real informational ADAS system running the QNX CAR™ Platform for Infotainment on a “Jacinto 6 Ex” processor.

I could try describing this system in detail, but just like the Star Wars movies, it’s best to experience our “Jacinto 6 Ex” and QNX CAR Platform-based system in person. Contact your TI or QNX representative today to schedule a meeting in our private suite at the TI Village (N115-N119) at CES, where you can immerse yourself in a demonstration that combines IVI, cluster, megapixel surround view, and a DLP®-based HUD with augmented reality, all running on a single “Jacinto 6 Ex” SoC. And don't forget to visit the QNX booth (2231), where you can see the QNX reference vehicle running a variety of ADAS and infotainment applications on “Jacinto 6” processors.

Integrated cockpit featuring DLP powered HUD and QNX CAR Platform running on a single “Jacinto 6 Ex” SoC.
One day I’ll experience Skywalker’s life: I will no doubt have the opportunity to control an intelligent, autonomous vehicle with my biometrics, voice, and gestures while riding with my family to the movie theater, playing chess with my grandkids (not yet a Wookiee).

The power of together

Bringing more technologies into the car is all well and good. The real goal, however, is to integrate them in a way that genuinely improves the driving experience.

Can we all agree that ‘synergy’ has become one of the most misused and overused words in the English language? In the pantheon of verbal chestnuts, synergy holds a place of honor, surpassed only by ‘best practices’ and ‘paradigm shift’.

Mind you, you can’t blame people for invoking the word so often. Because, as we all know, the real value in things often comes from their interaction — the moment they stop acting alone and start working in concert. The classic example is water, yeast, and flour, a combination that yields something far more flavorful than its constituent parts. I am speaking, of course, of bread.

Automakers get this principle. Case in point: adaptive cruise control, which takes a decades-old concept — conventional cruise control — and marries it with advances in radar sensors and digital signal processing. The result is something that doesn’t simply maintain a constant speed, but can help reduce accidents and, according to some research, traffic jams.

At QNX Software Systems, we also take this principle to heart. For example, read my recent post on the architecture of the QNX CAR Platform and you’ll see that we consciously designed the platform to help things work together. In fact, the platform's ability to integrate numerous technologies, in a seamless and concurrent fashion, is arguably its most salient quality.

This ability to blend disparate technologies into a collaborative whole isn't just a gee-whiz feature. Rather, it is critical to enabling the continued evolution and success of the connected car. Because it’s not enough to have smartphone connectivity. Or cloud connectivity. Or digital instrument clusters. Or any number of ADAS features, from collision warnings to autonomous braking. The real magic, and real value to the consumer, occurs when some or all of these come together to create something greater than the sum of the parts.

Simply put, it's all about the — dare I say it? — synergy that thoughtful integration can offer.

At CES this year, we will explore the potential of integration and demonstrate the unexpected value it can bring. The story begins on the QNX website.

A need for speed... and safety

Matt Shumsky
Matt Shumsky
For me, cars and safety go hand in hand. Don’t get me wrong, I have a need for speed. I do, after all, drive a 2006 compact with 140 HP (pause for laughter). But no one, and I mean no one, wants to be barreling down a highway in icy conditions at 120 km/h without working brakes, am I right?

So this raises the question: what’s the best way to design a software system that ensures the adaptive cruise control system keeps a safe distance from the car ahead? Or that tells the digital instrument cluster the correct information to display? And how can you make sure the display information isn’t corrupted?

Enter QNX and the ISO 26262 functional safety standard.

QNX Software Systems is partnering with LDRA to present a webinar on “Ensuring Automotive Functional Safety”. During this webinar, you’ll learn about:
  • Development and verification tools proven to help provide safer automotive software systems
  • How suppliers can develop software systems faster with an OS tuned for automotive safety

Ensuring Automotive Functional Safety with QNX and LDRA
Thursday, November 20, 2014
9:00 am PST / 12:00 pm EST / 5:00 pm UTC

I hope you can join us!

Japan update: ADAS, wearables, integrated cockpits, and autonomous cars

Yoshiki Chubachi
Yoshiki Chubachi
Will the joy of driving be a design criterion for tomorrow’s vehicles? It had better be.

A couple of weeks ago, QNX Software Systems sponsored Telematics Japan in Tokyo. This event offers a great opportunity to catch up with colleagues from automotive companies, discuss technology and business trends, and showcase the latest technology demos. Speaking of which, here’s a photo of me with a Japan-localized demo of the QNX CAR Platform. You can also see a QNX-based digital instrument cluster in the lower-left corner — this was developed by Three D, one of our local technology partners:



While at the event, I spoke on the panel, “Evolving ecosystems for future HMI, OS, and telematics platform development.” During the discussion, we conducted a real-time poll and asked the audience three questions:

1) Do you think having Apple CarPlay and Android Auto will augment a vehicle brand?
2) Do you expect wearable technologies to be integrated into cars?
3) If your rental car were hacked, who would you complain to?

For question 1, 32% of the audience said CarPlay and Android Auto will improve a brand; 68% didn't think so. In my opinion, this result indicates that smartphone connectivity in cars is now an expected feature. For question 2, 76% answered that they expect to see wearables integrated into cars. This response gives us a new perspective — people are looking at wearables as a possible addition to go with ADAS systems. For example, a wearable device could help prevent accidents by monitoring the driver for drowsiness and other dangerous signs. For question 3, 68% said they would complain to the rental company. Mind you, this raises the question: if your own car were hacked, who would you complain to?

Integrated cockpits
There is growing concern around safety and security as companies attempt to win more business by leveraging connectivity in cars. The trend is apparent if you look at the number of safety- and security-related demos at various automotive shows.

Case in point: I recently attended a private automotive event hosted by Renesas, where many ADAS and integrated cockpit demos were on display. And last month, CEATEC Japan (aka the CES of Japan) featured integrated cockpit demos from companies like Fujitsu, Pioneer, Mitsubishi, Kyocera, and NTT Docomo.

For the joy of it
Things are so different from when I first started developing in-car navigation systems 20 years ago. Infotainment systems are now turning into integrated cockpits. In Japan, the automotive industry is looking at the early 2020s as the time when commercially available autonomous cars will be on the road. In the coming years, the in-car environment, including infotainment, cameras, and other systems, will change immensely — I’m not exactly sure what cars in the year 2020 will look like, but I know it will be something I could never have imagined 20 years ago.

A panel participant at Telematics Japan said to me, “If autonomous cars become reality and my car is not going to let me drive anymore, I am not sure what the point of having a car is.” This is true. As we continue to develop for future cars, we may want to remind ourselves of the “joy of driving” factor.

Are you ready to stop micromanaging your car?

I will get to the above question. Honest. But before I do, allow me to pose another one: When autonomous cars go mainstream, will anyone even notice?

The answer to this question depends on how you define the term. If you mean completely and absolutely autonomous, with no need for a steering wheel, gas pedal, or brake pedal, then yes, most people will notice. But long before these devices stop being built into cars, another phenomenon will occur: people will stop using them.

Allow me to rewind. Last week, Tesla announced that its Model S will soon be able to “steer to stay within a lane, change lanes with the simple tap of a turn signal, and manage speed by reading road signs and using traffic-aware cruise control.” I say soon because these functions won't be activated until owners download a software update in the coming weeks. But man, what an update.

Tesla may now be at the front of the ADAS wave, but the wave was already forming — and growing. Increasingly, cars are taking over mundane or hard-to-perform tasks, and they will only become better at them as time goes on. Whether it’s autonomous braking, automatic parking, hill-descent control, adaptive cruise control, or, in the case of the Tesla S, intelligent speed adaptation, cars will do more of the driving and, in so doing, socialize us into trusting them with even more driving tasks.

Tesla Model S: soon with autopilot
In other words, the next car you buy will prepare you for not having to drive the car after that.

You know what’s funny? At some point, the computers in cars will probably become safer drivers than humans. The humans will know it, but they will still clamor for steering wheels, brake pedals, and all the other traditional accoutrements of driving. Because people like control. Or, at the very least, the feeling that control is there if you want it.

It’s like cameras. I would never think of buying a camera that didn’t have full manual mode. Because control! But guess what: I almost never turn the mode selector to M. More often than not, it’s set to Program or Aperture Priority, because both of these semi-automated modes are good enough, and both allow me to focus on taking the picture, not on micromanaging my camera.

What about you? Are you ready for a car that needs a little less micromanagement?

A glaring look at rear-view mirrors

Some reflections on the challenge of looking backwards, followed by the vexing question: where, exactly, should video from a backup camera be displayed?

Mirror, mirror, above the dash, stop the glare and make it last! Okay, maybe I've been watching too many Netflix reruns of Bewitched. But mirror glare, typically caused by bright headlights, is a problem — and a dangerous one. It can create temporary blind spots on your retina, leaving you unable to see cars or pedestrians on the road around you.

Automotive manufacturers have offered solutions to this problem for decades. For instance, many car mirrors now employ electrochromism, which allows the mirror to dim automatically in response to headlights and other light sources. But when, exactly, did the first anti-glare mirrors come to market?

According to Wikipedia, the first manual-tilt day/night mirrors appeared in the 1930s. These mirrors typically use a prismatic, wedge-shaped design in which the rear surface (which is silvered) and the front surface (which is plain glass) are at angles to each other. In day view, you see light reflected off the silvered rear surface. But when you tilt the mirror to night view, you see light reflected off the unsilvered front surface, which, of course, has less glare.

Manual-tilt day/night mirrors may have debuted in the 30s, but they were still a novelty in the 50s. Witness this article from the September 1950 issue of Popular Science:



True to their name, manual-tilt mirrors require manual intervention: You have to take your hand off the wheel to adjust them, after you’ve been blinded by glare. Which is why, as early as 1958, Chrysler was demonstrating mirrors that could tilt automatically, as shown in this article from the October 1958 issue of Mechanix Illustrated:


Images: Modern Mechanix blog

Fast-forward to backup cameras
Electrochromic mirrors, which darken electronically, have done away with the need to tilt, either manually or automatically. But despite their sophistication, they still can't overcome the inherent drawbacks of rear-view mirrors, which provide only a partial view of the area behind the vehicle — a limitation that contributes to backover accidents, many of them involving small children. Which is why NHTSA has mandated the use of backup cameras by 2018 and why the last two QNX technology concept cars have shown how video from backup cameras can be integrated with other content in a digital instrument cluster.

Actually, this raises the question: just where should backup video be displayed? In the cluster, as demonstrated in our concept cars? Or in the head unit, the rear-view mirror, or a dedicated screen? The NHTSA ruling doesn’t mandate a specific device or location, which isn't surprising, as each has its own advantages and disadvantages.

Consider, for example, ease of use: Will drivers find one location more intuitive and less distracting than the alternatives? In all likelihood, the answer will vary from driver to driver and will depend on individual cognitive styles, driving habits, and vehicle design.

Another issue is speed of response. According to NHTSA’s ruling, any device displaying backup video must do so within 2.5 seconds of the car shifting into reverse. Problem is, the ease of complying with this requirement depends on the device in question. For instance, NHTSA acknowledges that “in-mirror displays (which are only activated when the reverse gear is selected) may require additional warm-up time when compared to in-dash displays (which may be already in use for other purposes such as route navigation).”

At first blush, in-dash displays such as head units and digital clusters have the advantage here. But let’s remember that booting quickly can be a challenge for these systems because of their greater complexity — many offer a considerable amount of functionality. So imagine what happens when the driver turns the ignition key and almost immediately shifts into reverse. In that case, the cluster or head unit must boot up and display backup video within a handful of seconds. It's important, then, that system designers choose an OS that not only supports rich functionality, but also allows the system to start up and initialize applications in the least time possible.
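One common way to reason about this is as a startup-sequencing problem: bring up the minimal camera path first and defer heavier services. Here’s a toy sketch of that idea; the task names and durations are invented for illustration and say nothing about how any actual QNX system is sequenced:

```python
# Hypothetical sketch of a startup sequencer that brings up the
# backup-camera path before heavier infotainment services, so the
# 2.5-second display window can be met. All names and timings are
# invented for illustration.

def startup_order(tasks: dict[str, float], critical: list[str]) -> list[str]:
    """Schedule critical tasks first, then the rest by duration."""
    rest = [t for t in tasks if t not in critical]
    order = critical + sorted(rest, key=tasks.__getitem__)
    elapsed = sum(tasks[t] for t in critical)
    assert elapsed <= 2.5, f"camera path misses the budget: {elapsed:.1f} s"
    return order

tasks = {"display_driver": 0.4, "camera_pipeline": 0.9,
         "nav": 3.0, "media": 2.0}
print(startup_order(tasks, ["display_driver", "camera_pipeline"]))
# ['display_driver', 'camera_pipeline', 'media', 'nav']
```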

Ontario tech companies team up to target the connected car

To predict who will play a role in tomorrow's connected vehicles, you need to look beyond the usual suspects.

When someone says “automobile,” what’s the first word that comes to mind? Chances are, it isn’t Ontario. And yet Ontario — the Canadian province that is home to QNX headquarters — is a world-class hub of automotive R&D and manufacturing. Chrysler, Ford, General Motors, Honda, and Toyota all have plants here. As do 350 parts suppliers. In fact, Ontario produced 2.5 million vehicles in 2012 alone.

No question, Ontario has the smarts to build cars. But to fully appreciate what Ontario has to offer, you need to look beyond the usual suspects in the auto supply chain. Take QNX Software Systems, for example. Our roots are in industrial computing, but in the early 2000s we started to offer software technology and expertise to the world’s automakers and tier one suppliers. And now, a decade later, QNX offers the premier platform for in-car infotainment, with deployments in tens of millions of vehicles.

QNX Software Systems is not alone. Ontario is home to many other “non-automotive” technology companies that are playing, or are poised to play, a significant role in creating new automotive experiences. But just who are these companies? The Automotive Parts Manufacturers Association (APMA) of Canada would like you to know. Which is why they've joined forces with QNX and other partners to build the APMA Connected Vehicle.

A showcase for Ontario technology.
The purpose of the vehicle is simple: to showcase how Ontario companies can help create the next generation of connected cars. The vehicle is based on a Lexus RX350 — built in Ontario, of course — equipped with a custom-built infotainment system and digital instrument cluster built on QNX technology. Together, the QNX systems integrate more than a dozen technologies and services created in Ontario, including gesture recognition, biometric security, emergency vehicle notification, LED lighting, weather telematics, user interface design, smartphone charging, and cloud connectivity.

Okay, enough from me. Time to nuke some popcorn, dim the lights, and hit the Play button: