The patent's assignee is
News editors obtained the following quote from the background information supplied by the inventors: "An increasing number of vehicles are being equipped with one or more independent computer and electronic processing systems. Certain of the processing systems are provided for vehicle operation or efficiency. For example, many vehicles are now equipped with computer systems for controlling engine parameters, brake systems, tire pressure and other vehicle operating characteristics. A diagnostic system may also be provided that collects and stores information regarding the performance of the vehicle's engine, transmission, fuel system and other components. The diagnostic system can typically be connected to an external computer to download or monitor the diagnostic information to aid a mechanic during servicing of the vehicle.
"Additionally, other processing systems may be provided for vehicle driver or passenger comfort and/or convenience. For example, vehicles commonly include navigation and global positioning systems and services, which provide travel direction and emergency roadside assistance. Vehicles are also provided with multimedia entertainment systems that include sound systems, e.g., satellite radio, broadcast radio, compact disk and MP3 players and video players. Still further, vehicles may include cabin climate control, electronic seat and mirror repositioning and other operator comfort features.
"However, each of the above processing systems is independent, non-integrated and incompatible. That is, such processing systems provide their own sensors, input and output devices, power supply connections and processing logic. Moreover, such processing systems may include sophisticated and expensive processing components, such as application specific integrated circuit (ASIC) chips or other proprietary hardware and/or software logic that are incompatible with other processing systems in the vehicle or the surrounding environment.
"Additionally, consumers use their smart phones for many things (there is an app for that). They want to stay connected and bring their digital worlds along when they are driving a vehicle. They expect consistent experiences as they drive. But, smartphones and vehicles are two different worlds. While the smartphone enables their voice and data to roam with them, their connected life experiences and application (app)/service relationships do not travel with them in a vehicle.
"Consider a vehicle as an environment that has ambient intelligence by virtue of its sensory intelligence, IVI (in-vehicle infotainment) systems, and other in-vehicle computing or communication devices. The temporal context of this ambient intelligent environment of the vehicle changes dynamically (e.g., the vehicle's speed, location, what is around the vehicle, weather, etc. change dynamically) and the driver may want to interact in this ambient intelligent environment with mobile apps and/or cloud-based services. However, conventional systems are unable to react and adapt to these dynamically changing environments.
"As computing environments become distributed, pervasive and intelligent, multi-modal interfaces need to be designed that leverage the ambient intelligence of the environment, the available computing resources (e.g., apps, services, devices, in-vehicle processing subsystems, an in-vehicle heads-up display (HUD), an extended instrument cluster, a Head-Unit, navigation subsystems, communication subsystems, media subsystems, computing resources on mobile devices carried into a vehicle or mobile devices coupled to an in-vehicle communication subsystem, etc.), and the available interaction resources. Interaction resources are end points (e.g., apps, services, devices, etc.) through which a user can consume (e.g., view, listen or otherwise experience) output produced by another resource. However, it is difficult to design a multi-modal experience that adapts to a dynamically changing environment. The changes in the environment may be the availability or unavailability of a resource, such as an app, service or a device, a change in the context of the environment, or temporal relevance. Given the dynamic changes in the ambient intelligent environment, the user experience needs to transition smoothly from one context of use to another context while conforming to the constraints and maintaining consistent usability and relevance.
"Today, there is a gap between the actual tasks a user should be able to perform and the user interfaces exposed by the applications and services to support those tasks while conforming to the dynamically changing environments and related constraints. This gap exists because the user interfaces are typically not designed for dynamically changing environments and they cannot be distributed across devices in ambient intelligent environments.
"There is a need for design frameworks that can be used to create interactive, multi-modal user experiences for ambient intelligent environments. The diversity of contexts of use that such user interfaces need to support requires them to work across the heterogeneous interaction resources in the environment and to provide dynamic binding with ontologically diverse applications and services that want to be expressed.
"Some conventional systems provide middleware frameworks that enable services to interoperate with each other while running on heterogeneous platforms; but, these conventional frameworks do not provide adaptive mapping between the actual tasks a user should be able to perform and the user interfaces exposed by available resources to support those tasks.
"There is no framework available today that can adapt and transform the user interface for any arbitrary service at run-time to support a dynamically changing environment. Such a framework will need to support on-the-fly composition of user interface elements, such that the overall experience remains contextually relevant, optimizing the available resources while conforming to any environmental constraints. Further, the framework must ensure that the resulting user interface at any point in time is consistent, complete and continuous; consistent because the user interface must use a limited set of interaction patterns consistently to present the interaction modalities of any task; complete because all interaction tasks that are necessary to achieve a goal must be accessible to the user regardless of which devices may be available in the environment; continuous because the framework must orchestrate and manage all transitions as one set of tasks in a progression to another set of tasks. No such framework exists today that visualizes and distributes user interfaces dynamically to enable the user to interact with an ambient computing environment by allocating tasks to interaction resources in a manner that the overall experience is consistent, complete, and continuous."
As a supplement to the background information on this patent application, VerticalNews correspondents also obtained the inventors' summary information for this patent application: "A system and method for providing an adaptive experience framework for an ambient intelligent environment are disclosed herein in various example embodiments. An example embodiment provides a user experience framework that can be deployed to deliver consistent experiences that adapt to the changing context of a vehicle and the user's needs and is inclusive of any static and dynamic applications, services, devices, and users. Apart from delivering contextually relevant and usable experiences, the framework of an example embodiment also addresses distracted driving, taking into account the dynamically changing visual, manual and cognitive workload of the driver.
"The framework of an example embodiment provides a multi-modal and integrated experience that adapts to a dynamically changing environment. The changes in the environment may be caused by the availability or unavailability of a resource, such as an app, service or a device; or a change in the temporal context of the environment; or a result of a user's interaction with the environment. As used herein, temporal context corresponds to time-dependent, dynamically changing events and signals in an environment. In a vehicle-related embodiment, temporal context can include the speed of the vehicle (and other sensory data from the vehicle, such as fuel level, etc.), location of the vehicle, local traffic at that moment and place, local weather, destination, time of the day, day of the week, etc. Temporal relevance is the act of making sense of these context-changing events and signals to filter out signal from noise and to determine what is relevant in the here and now. The various embodiments described herein use a goal-oriented approach to determine how a driver's goals (e.g., destination toward which the vehicle is headed, media being played/queued, conversations in progress/likely, etc.) might change because of a trigger causing a change in the temporal context. The various embodiments described herein detect a change in temporal context to determine (reason and infer) what is temporally relevant. Further, some embodiments infer not only what is relevant right now, but also predict what is likely to be relevant next. Given the dynamic changes in the ambient intelligent environment, the user experience transitions smoothly from one context of use to another context while conforming to the constraints and maintaining consistent usability and relevance.
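The patent describes temporal relevance only in prose. As a minimal illustrative sketch (the names `TemporalContext` and `is_relevant`, and the specific filtering rules, are hypothetical and not from the patent text), the idea of filtering intents against a changing vehicle context might look like this:

```python
from dataclasses import dataclass

@dataclass
class TemporalContext:
    """Hypothetical snapshot of time-dependent vehicle signals."""
    speed_kmh: float   # vehicle speed
    hour_of_day: int   # time of day
    fuel_level: float  # 0.0 (empty) to 1.0 (full)

def is_relevant(intent_category: str, ctx: TemporalContext) -> bool:
    """Filter signal from noise: decide whether an intent category is
    relevant in the current temporal context (illustrative rules only)."""
    if intent_category == "video" and ctx.speed_kmh > 0:
        return False   # suppress video while the vehicle is moving
    if intent_category == "fuel_alert" and ctx.fuel_level > 0.15:
        return False   # fuel alerts only matter when fuel is low
    return True

ctx = TemporalContext(speed_kmh=80.0, hour_of_day=18, fuel_level=0.10)
print(is_relevant("video", ctx))       # False: vehicle is moving
print(is_relevant("fuel_alert", ctx))  # True: fuel is low
```

When a trigger changes the context (e.g., the vehicle stops), re-evaluating `is_relevant` yields a different set of relevant intents, which is the adaptation the patent describes.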
"The framework of an example embodiment also adapts to a dynamically changing environment as mobile devices, and the mobile apps therein, are brought into the environment. Because the presence of new mobile devices and mobile apps brought into the environment represents additional computing platforms and services, the framework of an example embodiment dynamically and seamlessly integrates these mobile devices and mobile apps into the environment and into the user experience. In a vehicle-related environment, an embodiment adapts to the presence of mobile devices and mobile apps as these devices are brought within proximity of a vehicle; and the apps are active and available on the mobile device. The various embodiments integrate these mobile devices/apps into the vehicle environment and with the other vehicle computing subsystems available therein. This integration is non-trivial as there may be multiple mobile apps that a user might want to consume; but, each mobile app may be developed by potentially different developers who use different user interfaces and/or different application programming interfaces (APIs). Without the framework of the various embodiments, the variant interfaces between mobile apps would cause the user interface to change completely when the user switched from one app or one vehicle subsystem to another. This radical switch in the user interface occurs in conventional systems when the user interface of a foreground application completely takes over all of the available interaction resources. This radical switch in the user interface can be confusing to a driver and can increase the driver's workload, which can lead to distracted driving as the driver tries to disambiguate the change in the user interface context from one app to another. In some cases, multiple apps cannot be consumed as such by the driver in a moving car, if the user interface completely changes from one app to the next.
For example, the duration and frequency of interactions required by the user interface may make it unusable in the context of a moving car. Further, when the driver is consuming a given application, a notification from another service or application can be shown overlaid on top of the foreground application. However, consuming the notification means switching to the notifying app where the notification can be dealt with/actioned. Context switching of apps, again, increases the driver workload as the switched app is likely to look and feel different and to have its own interaction paradigm.
"The various embodiments described herein eliminate this radical user interface switch when mobile devices/apps are brought into the environment by providing an inclusive framework to consume multiple applications (by way of their intents) in one, integrated user experience. The various embodiments manage context switching, caused by application switching, through the use of an integrated user experience layer where several applications can be plugged in simultaneously. Each application can be expressed in a manner that does not consume all the available interaction resources. Instead, a vertical slice (or other user interface portion or application intent) from each of the simultaneously in use applications can be expressed using a visual language and interaction patterns that make the presentation of each of the simultaneously in-use tasks homogenous, thereby causing the user experience to be consistent across each of the in-use applications.
"The embodiments described herein specify the application in terms of its intent(s), that is, the set of tasks that help a user accomplish a certain goal. The application intent could be enabling a user task (or an activity), a service, or delivering a notification to the user. The application's intent can be specified in application messages. These messages can carry the information required to understand the temporal intent of the application in terms of the object (e.g., the noun or content) of the application, the input/output (I/O) modality of the intent/task at hand (e.g., how to present the object to the user), and the actions (e.g., the verbs associated with the application) that can be associated with the task at hand (the intent). As such, an intent as used herein can refer to a message, event, or request associated with a particular task, application, or service in a particular embodiment. One example embodiment provides a Service Creation interface that enables the developer of the application or service to describe their application's intent so that the application's intent can be handled/processed at run-time. The description of the application's intent can include information, such as the Noun (object) upon which the application will act, the Verbs or the action or actions that can be taken on that Noun, and the Interaction and Launch Directives that specify how to interact with that object and launch a target action or activity (e.g., the callback application programming interface (API) to use). In other words, the Service Creation interface enables a developer to describe their application in terms of intents and related semantics using a controlled vocabulary of Nouns and Verbs that represent well-defined concepts specified in an environment-specific ontology. Further, an application intent description can also carry metadata, such as the application's domain or category, context of use, criticality, time sensitivity, etc., enabling the system to deal appropriately with the temporal intent of the application.
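The patent specifies intent descriptions only abstractly (Noun, Verbs, Interaction and Launch Directives, metadata). A minimal sketch of such a description as a data structure might look as follows; all field names and the example values are assumptions for illustration, not taken from the patent:

```python
from dataclasses import dataclass, field

@dataclass
class ApplicationIntent:
    """Hypothetical intent description per the patent's Noun/Verb vocabulary."""
    noun: str                   # the object the application acts upon
    verbs: list                 # actions that can be taken on the Noun
    interaction_directive: str  # how to present/interact with the object
    launch_directive: str       # callback used to launch the target action
    metadata: dict = field(default_factory=dict)  # domain, criticality, etc.

# A navigation app might publish an intent such as:
navigate = ApplicationIntent(
    noun="destination",
    verbs=["navigate", "share", "save"],
    interaction_directive="voice+visual",
    launch_directive="nav.start_route",
    metadata={"domain": "navigation", "criticality": 2, "time_sensitive": True},
)
print(navigate.verbs[0])  # navigate
```

The controlled vocabulary the patent mentions would constrain `noun` and `verbs` to concepts defined in an environment-specific ontology rather than free-form strings.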
"The application's temporal intent description can be received by a particular embodiment as messages. The metadata in the messages can be used to filter, order, and queue the received messages for further processing. The further processing can include transforming the messages appropriately for presentation to the user so that the messages are useful, usable, and desirable. In the context of a vehicle, the processing can also include presenting the messages to the user in a manner that is vehicle-appropriate using a consistent visual language with minimal interaction patterns (keeping only what is required to disambiguate the interaction) that are carefully designed to minimize driver distraction. The processing of ordered application intent description messages includes mapping the particular application intent descriptions to one or more tasks that will accomplish the described application intent. Further, the particular application intent descriptions can be mapped onto abstract I/O objects. At run-time, the abstract I/O objects can be visualized by mapping the abstract I/O objects onto available concrete I/O resources. The various embodiments also perform processing operations to determine where, how, and when to present application information to the user in a particular environment, so that the user can use the application, obtain results, and achieve their goals. Any number of application intent descriptions, from one or more applications, can be requested or published to the various embodiments for concurrent presentation to a user. The various intents received from one or more applications get filtered and ordered based on the metadata, such as criticality and relevance based on the knowledge of the temporal context. The various embodiments compose the application intent descriptions into an integrated user experience employing the environmentally appropriate visual language and interaction patterns.
Application intent transitions and orchestration are also handled by the various embodiments. At run-time, the application intent descriptions can be received by the various embodiments using a services gateway as a message or notification receiver.
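The filter/order/queue step described above can be sketched in a few lines; the metadata keys `relevant` and `criticality` (lower value = more critical) are assumptions for illustration, as the patent does not define a concrete schema:

```python
def order_intents(messages):
    """Filter out irrelevant intent messages, then order the remainder by
    criticality for presentation (illustrative sketch, not the patent's
    actual algorithm). Python's sort is stable, so messages with equal
    criticality keep their arrival (queue) order."""
    relevant = [m for m in messages if m["metadata"].get("relevant", True)]
    return sorted(relevant, key=lambda m: m["metadata"].get("criticality", 9))

msgs = [
    {"noun": "song",      "metadata": {"criticality": 5}},
    {"noun": "collision", "metadata": {"criticality": 0}},
    {"noun": "ad",        "metadata": {"criticality": 9, "relevant": False}},
]
print([m["noun"] for m in order_intents(msgs)])  # ['collision', 'song']
```

The ordered list would then feed the mapping stage: each intent description is mapped to tasks and abstract I/O objects, which are bound to concrete I/O resources at run-time.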
"Further, the experience framework as described herein manages transitions caused by messages, notifications, and changes in the temporal context. The experience framework of an example embodiment orchestrates the tasks that need to be made available simultaneously for a given temporal context change and manages any state transitions, such that the experience is consistent, complete, and continuous. The experience framework manages these temporal context changes through an equivalent of a composite or multi-modal dialog as opposed to a modal user interface that the foreground application presents in conventional systems.
BRIEF DESCRIPTION OF THE DRAWINGS
"The various embodiments are illustrated by way of example, and not by way of limitation, in the figures of the accompanying drawings in which:
"FIG. 1 illustrates an example set of components of the adaptive experience framework of an example embodiment;
"FIG. 2 illustrates a task set in an example embodiment of the adaptive experience framework;
"FIG. 3 illustrates a task hierarchy in a task set of an example embodiment of the adaptive experience framework;
"FIG. 4 illustrates input interaction resources and output interaction resources of an example embodiment of the adaptive experience framework;
"FIG. 5 illustrates the components of a task model in an example embodiment of the adaptive experience framework;
"FIG. 6 illustrates a notification module of an example embodiment of the adaptive experience framework;
"FIG. 7 illustrates a reference model of an example embodiment of the adaptive experience framework;
"FIG. 8 illustrates a reference architecture of an example embodiment of the adaptive experience framework;
"FIG. 9 illustrates the processing performed by the task model in an example embodiment;
"FIGS. 10 and 11 illustrate the processing performed by the adaptive experience framework in an example embodiment;
"FIG. 12 illustrates an example of the adaptive experience framework in a vehicle environment in an example embodiment;
"FIG. 13 is a processing flow chart illustrating an example embodiment of a system and method for providing an adaptive experience framework for an ambient intelligent environment; and
"FIG. 14 shows a diagrammatic representation of a machine in the example form of a computer system within which a set of instructions when executed may cause the machine to perform any one or more of the methodologies discussed herein."
For additional information on this patent application, see: Madhok, Ajay; Malahy, Evan; Morris, Ron. Adaptive Experience Framework for an Ambient Intelligent Environment. Filed
Keywords for this news article include:
Our reports deliver fact-based news of research and discoveries from around the world. Copyright 2014, NewsRx LLC