
A question of architecture

The second of a series on the QNX CAR Platform. In this installment, we start at the beginning — the platform’s underlying architecture.

In my previous post, I discussed how infotainment systems must perform multiple complex tasks, often all at once. At any time, a system may need to manage audio, show backup video, run 3D navigation, synch with Bluetooth devices, display smartphone content, run apps, present vehicle data, process voice signals, perform active noise control… the list goes on.

The job of integrating all these functions is no trivial task — an understatement if ever there was one. But as with any large project, starting with the right architecture, the right tools, and the right building blocks can make all the difference. With that in mind, let’s start at the beginning: the underlying architecture of the QNX CAR Platform for Infotainment.

The architecture consists of three layers: human machine interface (HMI), middleware, and platform.



The HMI layer
The HMI layer is like a bonus pack: it supports two reference HMIs out of the box, both of which have the same appearance and functionality. So what’s the difference? One is based on HTML5, the other on Qt 5. This choice demonstrates the underlying flexibility of the platform, which allows developers to create an HMI with any of several technologies, including HTML5, Qt, or a third-party toolkit such as Elektrobit GUIDE or Crank Storyboard.

A choice of HMIs
Mind you, the choice goes further than that. When you build a sophisticated infotainment system, it soon becomes obvious that no single tool or technology can do the job. The home screen, which may contain controls for Internet radio, hands-free calls, HVAC, and other functions, might need an environment like Qt. The navigation app, for its part, will probably use OpenGL ES. Meanwhile, some applications might be based on Android or HTML5. Together, all these heterogeneous components make up the HMI.

The QNX CAR Platform embraces this heterogeneity, allowing developers to use the best tools and application environments for the job at hand. More to the point, it allows developers to blend multiple app technologies into a single, unified user interface, where they can all share the same display, at the same time.

To perform this blending, the platform employs several mechanisms, including a component called the graphical composition manager. This manager acts as a kind of universal framework, providing all applications, regardless of how they’re built, with a highly optimized path to the display.

For example, look at the following HMI:



Now look at the HMI from another angle to see how it comprises several components blended together by the composition manager:



To the left, you see video input from a connected media player or smartphone. To the right, you see a navigation application based on OpenGL ES map-rendering software, with an overlay of route metadata implemented in Qt. And below, you see an HTML page that provides the underlying wallpaper; this page could also display a system status bar and UI menu bar across all screens.

For each component rendered to the display, the graphical composition manager allocates a separate window and frame buffer. It also allows the developer to control the properties of each individual window, including location, transparency, rotation, alpha, brightness, and z-order. As a result, it becomes relatively straightforward to tile, overlap, or blend a variety of applications on the same screen, in whichever way creates the best user experience.
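
To make this concrete, here’s a minimal sketch in C of how a native application might set those window properties. It assumes the Screen windowing API, which provides the C interface to the graphical composition manager; the position, size, and z-order values are purely illustrative, and error handling is omitted:

    /* Minimal sketch (not production code): create a window and set the
     * properties the composition manager uses when blending it with others. */
    #include <screen/screen.h>

    int main(void)
    {
        screen_context_t ctx;
        screen_window_t  win;

        screen_create_context(&ctx, SCREEN_APPLICATION_CONTEXT);
        screen_create_window(&win, ctx);

        /* Place the window on the shared display. */
        int pos[2]  = { 800, 0 };     /* x, y in pixels */
        int size[2] = { 480, 272 };   /* width, height  */
        screen_set_window_property_iv(win, SCREEN_PROPERTY_POSITION, pos);
        screen_set_window_property_iv(win, SCREEN_PROPERTY_SIZE, size);

        /* Stack it above the wallpaper layer and alpha-blend it with the
         * windows beneath it. */
        int zorder = 5;
        int transp = SCREEN_TRANSPARENCY_SOURCE_OVER;
        screen_set_window_property_iv(win, SCREEN_PROPERTY_ZORDER, &zorder);
        screen_set_window_property_iv(win, SCREEN_PROPERTY_TRANSPARENCY, &transp);

        /* Allocate buffers; the application renders into them, posts the
         * window, and the composition manager takes care of the rest. */
        screen_create_window_buffers(win, 2);

        screen_destroy_window(win);
        screen_destroy_context(ctx);
        return 0;
    }

Because every application environment ultimately renders into a window of this kind, the composition manager can tile, overlap, or blend content from all of them on the same display.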

The middleware layer
The middleware layer provides applications with a rich assortment of services, including Bluetooth, multimedia discovery and playback, navigation, radio, and automatic speech recognition (ASR). The ASR component, for example, can be used to turn on the radio, initiate a Bluetooth phone call from a connected smartphone, or select a song by artist or song title.

I’ll drill down into several of these services in upcoming posts. For now, I’d like to focus on a fundamental service that greatly simplifies how all other services and applications in the system interact with one another. It’s called persistent publish/subscribe messaging, or PPS, and it provides the abstraction needed to cleanly separate high-level applications from low-level business logic and services.

PPS messaging provides an abstraction layer between system services and high-level applications

Let’s rewind a minute. To implement communications between software components, C/C++ developers must typically define direct, point-to-point connections that tend to “break” when new features or requirements are introduced. For instance, an application communicates with a navigation engine, but all connections enabling that communication must be redefined when the system is updated with a different engine.

This fragility might be acceptable in a relatively simple system, but it creates a real bottleneck when you are developing something as complex, dynamic, and quickly evolving as the design for a modern infotainment system. PPS addresses the problem by allowing developers to create loose, flexible connections between components. As a result, it becomes much easier to add, remove, or replace components without having to modify other components.

So what, exactly, is PPS? Here’s a textbook answer: an asynchronous object-based system that consists of publishers and subscribers, where publishers modify the properties of data objects and the subscribers to those objects receive updates when the objects have been modified.

So what does that mean? Well, in a car, PPS data objects allow applications to access services such as the multimedia engine, voice recognition engine, vehicle buses, connected smartphones, hands-free calling, and contact databases. These data objects can each contain multiple attributes, each attribute providing access to a specific feature — such as the RPM of the engine, the level of brake fluid, or the frequency of the current radio station. System services publish these objects and modify their attributes; other programs can then subscribe to the objects and receive updates whenever the attributes change.
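
To make this a little more tangible, here’s a minimal publisher sketch in C. The object path (/pps/demo/radio) and its attributes are hypothetical, invented only for illustration; a real system service defines its own objects:

    /* Minimal publisher sketch: update attributes on a hypothetical PPS
     * object, which appears as a file under /pps. */
    #include <fcntl.h>
    #include <stdio.h>
    #include <string.h>
    #include <unistd.h>

    int main(void)
    {
        /* Opening (and, if necessary, creating) the object makes this
         * process a publisher. */
        int fd = open("/pps/demo/radio", O_WRONLY | O_CREAT, 0666);
        if (fd < 0) {
            perror("open /pps/demo/radio");
            return 1;
        }

        /* Attributes take the form name:encoding:value. An empty encoding
         * means plain text; 'n' marks a numeric value. */
        const char *update = "station:n:99.9\nartist::Some Band\n";
        write(fd, update, strlen(update));

        close(fd);
        return 0;
    }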

The PPS service is programming-language independent, allowing programs written in a variety of programming languages (C, C++, HTML5, Java, JavaScript, etc.) to intercommunicate, without any special knowledge of one another. Thus, an app in a high-level environment like HTML5 can easily access services provided by a device driver or other low-level service written in C or C++.
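
For example, a C subscriber needs nothing more than standard file I/O. The sketch below watches the same hypothetical object as above; the ?wait,delta options ask PPS to block the read until the object changes and to return only the attributes that changed:

    /* Minimal subscriber sketch: block until the hypothetical object
     * changes, then print only the changed attributes. */
    #include <fcntl.h>
    #include <stdio.h>
    #include <unistd.h>

    int main(void)
    {
        int fd = open("/pps/demo/radio?wait,delta", O_RDONLY);
        if (fd < 0) {
            perror("open /pps/demo/radio");
            return 1;
        }

        char buf[512];
        for (;;) {
            ssize_t n = read(fd, buf, sizeof(buf) - 1);  /* blocks until an update */
            if (n <= 0)
                break;
            buf[n] = '\0';
            /* buf holds lines such as "station:n:99.9"; parse and react here. */
            printf("update:\n%s", buf);
        }

        close(fd);
        return 0;
    }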

I’m only touching on the capabilities of PPS. To learn more, check out the QNX documentation on this service.

The platform layer
The platform layer includes the QNX OS and the board support packages, or BSPs, that allow the OS to run on various hardware platforms.

An inherently modular and extensible architecture
A BSP may not sound like the sexiest thing in the world — it is, admittedly, a deeply technical piece of software — but without it, nothing else works. And, in fact, one reason QNX Software Systems has such a strong presence in automotive is that it provides BSPs for all the popular infotainment platforms from companies like Freescale, NVIDIA, Qualcomm, and Texas Instruments.

As for the QNX Neutrino OS, you could write a book about it — which is another way of saying it’s far beyond the scope of this post. Suffice it to say that its modularity, extensibility, reliability, and performance set the tone for the entire QNX CAR Platform. To get a feel for what the QNX OS brings to the platform (and by extension, to the automotive industry), I invite you to visit the QNX Neutrino OS page on the QNX website.

Attending SAE Convergence? Here’s why you should visit booth 513

Cars and beer don’t mix. But discussing cars while having a beer? Now you’re talking. If you’re attending SAE Convergence next week, you owe it to yourself to register for our “Spirits And Eats” event at 7:00 pm Tuesday. It’s the perfect occasion to kick back and enjoy the company of people who, like yourself, are passionate about cars and car electronics. And it isn’t a bad networking opportunity either — you’ll meet folks from a variety of automakers, Tier 1s, and technology suppliers in a relaxed, convivial atmosphere.

But you know what? It isn’t just about the beer. Or the company. It’s also about the Benz. Our digitally modded Mercedes-Benz CLA45 AMG, to be exact. It’s the latest QNX technology concept car, and it’s the perfect vehicle (pun fully intended) for demonstrating how QNX technology can enable next-generation infotainment systems. Highlights include:

  • A multi-modal user experience that blends touch, voice, and physical controls
  • A secure application environment for Android, HTML5, and OpenGL ES
  • Smartphone connectivity options for projecting smartphone apps onto the head unit
  • A dynamically reconfigurable digital instrument cluster that displays turn-by-turn directions, notifications of incoming phone calls, and video from front and rear cameras
  • A multimedia framework for playback of content from USB sticks, DLNA devices, etc.
  • Full-band stereo calling — think phone calls with CD-quality audio
  • Engine sound enhancement that synchronizes synthesized engine sounds with engine RPM

Here, for example, is the digital cluster:



And here is a closeup of the head unit:



And here’s a shot of the cluster and head unit together:



As for the engine sound enhancement and high-quality hands-free audio, I can’t reproduce these here — you’ll have to come see the car and experience them firsthand. (Yup, that's an invite.)

If you like what you see, and are interested in what you can hear, visit us at booth #513. And if you'd like to schedule a demo or reserve some time with a QNX representative in advance, we can accommodate that, too. Just send us an email.

A question of concurrency

The first of a new series on the QNX CAR Platform for Infotainment. In this installment, I tackle the a priori question: why does the auto industry need this platform, anyway?

Define your terms, counseled Voltaire, and in keeping with his advice, allow me to begin with the following:

Concurrency \kən-kûr'-ən-sē\ n (1597) Cooperation, as of agents, circumstances, or events; agreement or union in action.

A good definition, as far as it goes. But it doesn’t go far enough for the purposes of this discussion. Wikipedia comes closer to the mark:

“In computer science, concurrency is a property of systems in which several computations execute simultaneously, and potentially interact with each other.”

That’s better, but it still falls short. However, the Wikipedia entry also states that:

“the base goals of concurrent programming include correctness, performance and robustness. Concurrent systems… are generally designed to operate indefinitely, including automatic recovery from failure, and not terminate unexpectedly.”

Now that’s more like it. Concurrency in computer systems isn’t simply a matter of doing several things all at once; it’s also a matter of delivering a solid user experience. The system must always be available and it must always be responsive: no “surprises” allowed.

This definition seems tailor-made for in-car infotainment systems. Here, for example, are some of the tasks that an infotainment system may perform:

  • Run a variety of user applications, from 3D navigation to Internet radio, based on a mix of technologies, including Qt, HTML5, Android, and OpenGL ES
  • Manage multiple forms of input: voice, touch, physical buttons, etc. 
  • Support multiple smartphone connectivity protocols such as MirrorLink and Apple CarPlay 
  • Perform services that smartphones cannot support, including:
    • HVAC control
    • discovery and playback of multimedia from USB sticks, DLNA devices, MTP devices, and other sources
    • retrieval and display of fuel levels, tire pressure, and other vehicle information
    • connectivity to Bluetooth devices
  • Process voice signals to ensure the best possible quality of phone-based hands-free systems — this in itself can involve many tasks, including echo and noise removal, dynamic noise shaping, speech enhancement, etc. 
  • Perform active noise control to eliminate unwanted engine “boom” noise 
  • Offer extremely fast bootup times; a backup camera, for example, must come up within a second or two to be useful
     
Juggling multiple concurrent tasks
The primary user of an infotainment system is the driver. So, despite juggling all these activities, an infotainment system must never show the strain. It must always respond quickly to user input and critical events, even when many activities compete for system resources. Otherwise, the driver will become annoyed or, worse, distracted. The passengers won’t be happy, either.

Still, that isn’t enough. Automakers also need to differentiate themselves, and infotainment serves as a key tool for achieving differentiation. So the infotainment system must not simply perform well; it must also allow the vehicle, or line of vehicles, to project the unique values, features, and brand identity of the automaker.

And even that isn’t enough. Most automakers offer multiple vehicle lines, each encompassing a variety of configurations and trim levels. So an infotainment design must also be scalable; that way, the work and investment made at the high end can be leveraged in mid-range and economy models. Because ROI.

Projecting a unique identity
But you know what? That still isn’t enough. An infotainment system design must also be flexible. It must, for example, support new functionality through software updates, whether such updates are installed through a storage device or over the air. And it must have the ability to accommodate quickly evolving connectivity protocols, app environments, and hardware platforms. All with the least possible fuss.

The nitty and the gritty
Concurrency, performance, reliability, differentiation, scalability, flexibility — a tall order. But it’s exactly the order that the QNX CAR Platform for Infotainment was designed to fill.

Take, for example, product differentiation. If you look at the QNX-powered infotainment systems that automakers are shipping today, one thing becomes obvious: they aren’t cookie-cutter systems. Rather, they each project the unique values, features, and brand identity of their automaker — even though they are all built on the same, standards-based platform.

So how does the QNX CAR Platform enable all this? That’s exactly what my colleagues and I will explore over the coming weeks and months. We’ll get into the nitty and sometimes the gritty of how the platform works and why it offers so much value to companies that develop infotainment systems in various shapes, forms, and price points.

Stay tuned.

POSTSCRIPT: Read the next installment of the QNX CAR Platform series, A question of architecture.

The wraps are off! First look at the new QNX technology concept car

A quick tour of one of the vehicles that QNX is unveiling at 2014 CES

You know what? Writing this post isn’t easy. All I’ve got are words and pictures, and neither could ever do justice to the user experience offered by the new QNX technology concept car. They cannot, for example, recreate the rich, luminous sound of the car’s full-band and wide-band hands-free calls. Nor can they evoke how the car blends speech recognition with a touch interface and physical controls to make navigation, Internet radio, and other applications wonderfully easy to use.

But on second thought, words and pictures aren’t that bad. Especially when the car — and the in-dash systems that the QNX concept team created for it — are so downright gorgeous. So what are we sitting around for? Time for a tour!

Actually... hold that thought. I just want to mention that, if you visit our Flickr page, you can find full-resolution versions of most of the images I've posted here. Because why settle for low res? Okay, back to the tour.

The car
I've got two things to say here. First, the car is based on a Mercedes-Benz CLA45 AMG. If you guessed the model correctly based on the teaser images we published on the QNX website, I bow in homage to your eagle eye. Second, while we snapped this photo in the QNX garage, don’t think for a minute that the garage is ever this neat and tidy. On any given day, it’s chock full of drill presses, tool boxes, work tables, embedded boards, and QNX engineers joyously modding the world’s coolest cars — exactly the kind of place you expect it to be. And want it to be! But to humor the photographer, we (temporarily) made this corner clutter-free. We're nice that way.



The dash
Let's get behind the wheel, where you can see the car's custom-built digital instrument cluster and infotainment system. The bold design, the clean layout, the super-easy-to-access controls — they all add up to systems you want to interact with. Just as important, the look and feel of the instrument cluster and infotainment system is totally different from the corresponding systems in our previous concept car — an excellent illustration of how the QNX platform can help customers create their own branded experiences.



The multi-talented cluster
Time to zoom in on the digital instrument cluster, which helps simplify driving tasks and minimize distraction with an impressive array of features. Turn-by-turn directions pulled from the navigation system? Check. Video feed from front and rear-view cameras? Check. Notifications of incoming phone calls? Check. Alerts of incoming text messages, which you can listen to at the touch of a steering-wheel button? Check.



The Android app support
Automakers want to tap into the talents of the mobile app community, and the QNX CAR Platform for Infotainment helps them do just that, with built-in support for Android, OpenGL ES, and HTML5. In the concept car, for example, you'll find an Android Jelly Bean version of iHeartRadio, Clear Channel’s digital radio service, running in a secure application container. The QNX CAR Platform takes this same sandboxed approach to running HTML5 apps — perfect for protecting both the HMI and the overall system from unpredictable web content:



Helping you get there in more ways than one
We designed the QNX CAR Platform to give automotive developers the greatest possible choice and flexibility. And that’s exactly what you see when it comes to navigation. For instance, the car supports navigation from Elektrobit:



and from HERE:



and from Kotei Informatics:



If that’s not enough, a demo system in the QNX booth at CES also demonstrates a navigation system from Aisin AW — more on that in an upcoming post.

Pardon me while I barge in
As I alluded to earlier, what you can't see in the new concept car is just as important as what you can see. For instance, if you look at this image, you'll see the infotainment system's media player. But what you can't see is new acoustics technology from QNX that lets you "barge in" and issue voice commands even when a song is playing. How cool is that?



When you find yourself in times of trouble...
... don't let it be, but rather, check and see. And to do that, you can use the infotainment system's virtual mechanic, which keeps tabs on your car's health, including fluid levels, brake wear, and, in this case, low tire pressure:



The cloud connection
Hold on, what's this? It looks like a smartphone app with an interface similar to that of the virtual mechanic, above. In fact, it's a lot more than that, and it touches on some cool (and very new) technology that can help cars become fully managed citizens of the cloud. More on that in an upcoming post.



That's it for now. For more details on what QNX is showcasing this week at CES, check out the press releases posted on the QNX website. And stay tuned to this channel for further updates from 2014 CES — including a profile of our very new QNX technology concept car for acoustics.

Is this the most jazzed-up Jeep ever to hit CES?

The fourth installment in the CES Cars of Fame series. Our inductee for this week: a Jeep that gets personal.

Paul Leroux
It might not be as hip as the Prius or as fast as the Porsche. But it's fun, practical, and flexible. Better yet, you can drive it just about anywhere. Which makes it the perfect vehicle to demonstrate the latest features of the QNX CAR Platform for Infotainment.

It's called the QNX reference vehicle, and it's been to CES in Las Vegas, as well as to Detroit, New York City, and lots of places in between. It's our go-to vehicle for whenever we want to hit the road and showcase our latest infotainment technology. It even made a guest appearance at IBM's recent Information On Demand 2013 Big Data conference, where it demonstrated the power of connecting cars to the cloud.

The reference vehicle, which is based on a Jeep Wrangler, serves a different purpose than our technology concept cars. Those vehicles take the QNX CAR Platform as a starting point to demonstrate how the platform can help automakers hit new levels of innovation. The reference vehicle plays a more modest, but equally important, role: to show what the platform can do out of the box.

For instance, we updated the Jeep recently to show how version 2.1 of the QNX CAR Platform will allow developers to blend a variety of application and HMI technologies on the same display. In this case, the Jeep's head unit is running a mix of native, HTML5, and Android apps on an HMI built with the Qt application framework:



Getting personal
We also use the Jeep to demonstrate the platform's support for customization and personalization. For instance, here is the first demonstration instrument cluster we created specifically for the Jeep:



And here's a more recent version:



These clusters may look very different, but they share the same underlying features, such as the ability to display turn-by-turn directions, weather updates, and other information provided by the head unit.

Keeping with the theme of personalization, the Jeep also demonstrates how the QNX CAR Platform allows developers to create re-skinnable HMIs. Here, for example, is a radio app in one skin:



And here's the same app in a different skin:



This re-skinnability isn't just cool; it also demonstrates how the QNX CAR Platform can help automotive developers create a single underlying code base and re-use it across multiple vehicle lines. Good, that.

Getting complementary
The Jeep is also the perfect vehicle to showcase the ecosystem of complementary apps and services integrated with the QNX CAR Platform, such as the (very cool) street director navigation system from Elektrobit:



To return to the question, is this really the most jazzed-up Jeep to hit CES? Well, it will be making a return trip to CES in just a few weeks, with a whole new software build. So if you're in town, drop by and let us know what you think.