
User:Edusoladana/MUSIC

MUSIC
Original author(s): MUSIC consortium
Operating system: Cross-platform
Type: Middleware
License: Open source
Website: www.ist-music.eu

MUSIC, an acronym for Self-Adapting Applications for Mobile USers In Ubiquitous Computing Environments, is a project that develops an open-source software development framework for building self-adapting, reconfigurable software that adapts to a highly dynamic user and execution context and maintains a high level of usefulness across context changes. Context-aware applications are capable of exploiting knowledge of external operating conditions, and they are self-adaptive if they adapt at runtime to varying contexts, such as changing user needs and operating environments.

The MUSIC project is part of the EU's Sixth Framework Programme for Research and Technological Development (RTD), under the Information Society Technologies (IST) priority listed in CORDIS. The project started in October 2006 and will end in March 2010.

Introduction to MUSIC

MUSIC is an open platform technology for software developers, addressing a paradigm of "any network, any device" with relevant content and the right context in a secure and trustworthy manner. MUSIC will provide technology for the development of innovative mobile applications, which are expected to play a role in the lives of European citizens in the future. As the public has grown accustomed to mobile services, expectations about which services should be provided, where they should be available and how well they should be adapted to users' needs have become more topical and demanding. However, the current range of mobile devices (e.g. PDAs, smartphones and GPS units) and the variety of infrastructures have exacerbated the challenges of building and maintaining such services in a user-friendly way. As a result, software developers must deal with an enormous number of issues related to configuration, operations, maintenance and change management in order to produce systems that can dynamically, securely and automatically adapt to public expectations in different scenarios and circumstances.

MUSIC will provide a design methodology and a distributed system architecture for the design and implementation of self-adapting applications in ubiquitous computing environments. This will be complemented by enhanced modeling languages for the specification of context dependencies and adaptation capabilities, supported by model specification, validation and simulation tools. The platform will be used to develop trial services based on a set of challenging application scenarios with real market potential, which play a central role as sources of requirements, as a means of assessing the technical adequacy of the results, and as a way of promoting them.

Previous research

The MADAM project

The MUSIC project builds on the MADAM project. The IST project MADAM, Mobility and ADaptation enAbling Middleware, provided software engineers with modeling language extensions, tools and middleware that collectively foster the design, implementation and operation of innovative applications and services for the mobile user and worker. To achieve this objective, the project studied the adaptation requirements of mobile applications and developed a theory of adaptation. A set of reusable adaptation strategies and adaptation mechanisms, based on a dynamically reconfigurable component architecture, was developed. The project also developed modeling language extensions and tools enabling application designers to specify adaptation capabilities at design time.

MADAM has provided prototype implementations of the reference middleware platform as well as the modeling tool extensions. Pilot applications have been developed to validate the approach and the prototype implementations.

The MADAM project terminated in the spring of 2007.

Relation between MADAM and MUSIC

The MUSIC technical approach is strongly inspired by preliminary results from the MADAM project. MUSIC will extend and generalize the experimental solutions developed in MADAM and take them to the level of maturity necessary to facilitate uptake by the European software industry. There are several innovations as compared with MADAM, including generalization to ubiquitous computing and Service-Oriented Architectures (SOAs).

Objectives

Figure: MUSIC adaptive system

MUSIC will provide an open platform that makes it technically and commercially feasible for the wider Information Technology (IT) industry (not just telecommunications operators) to develop innovative mobile applications which:

  • Are context-aware: understand user "context" in the widest sense, including factors related to users themselves (role, location, environmental conditions etc.) and factors related to changing availability of computing and communications facilities.
  • Are self-adapting: dynamically adapt functionality and internal implementation mechanisms to changes in context.
  • Are inherently distributed in nature, and may involve direct interactions between multiple users.
  • Are aimed primarily at mobile users, but may include stationary users too.
  • Address extra-functional aspects (e.g. security, dependability, ...) according to user needs.
  • Can be described as "innovative" either because they provide users with entirely new services or because they make traditional services available in a practical and usable form in a mobile environment.

The primary result of the project will be the open platform for the development of innovative mobile applications, made up of the following main components:

  • A design methodology for self-adapting applications.
  • A distributed system architecture forming a solid basis for the design and implementation of self-adapting applications in ubiquitous computing environments.
  • A comprehensive open-source software development framework that facilitates the development of self-adapting, reconfigurable software that seamlessly adapts to the highly dynamic user and execution context, and maintains a high level of usability and usefulness across context changes.
  • Enhanced modeling languages for the specification of context dependencies and adaptation capabilities, supported by model specification and validation tools.
  • Middleware and infrastructure services supporting functionality that is commonly needed in the type of adaptable mobile applications addressed in the project. An example would be support for distributed decision making about collaborative adaptations.
  • A prototype test and simulation environment, to allow developers to observe and analyse the effects of context changes and adaptations, and so carry out adaptation tuning.

Self-adaptation and context-awareness

Self-adaptation

Self-adaptive software systems are able to adapt at runtime to changing operating environments. Self-adaptation is currently complex and costly to implement, and has so far been applied mainly in domains where systems must have guaranteed dependability, for instance telecom exchanges or space vehicles. However, self-adaptation has become a requirement for more and more software systems, including mobile systems.

MUSIC will reduce the complexity of the development and implementation of these applications.

Context-awareness

Today, more and more applications are becoming context-aware. One application of context awareness is contextual advertising, in which website visitors are presented with advertisements that are most likely to be relevant to them, judging from their contextual situation: which country and city they are visiting from, which terms they have recently searched for, and so on.

Context awareness is one of the most important ingredients for achieving the ubiquitous computing paradigm.[1] In this paradigm, people are shielded from being overloaded by the need to interact with computer systems, partly by allowing the systems to configure themselves indirectly and autonomously. In this way, the required interaction (and thus the human attention required) is minimized. For the technology to recede into the background of our consciousness, new methods must be devised so that the required decisions are taken in an automated manner.

In the MUSIC project, one of the main targets is to develop systems that can autonomously and automatically adapt themselves, with the constant aim of optimizing the Quality of Service (QoS) offered to end users. This is partly achieved by enabling the systems to continuously and automatically collect information about the status of the user and the corresponding applications, so that adaptation decisions can be made without the direct intervention of the user. This information is collected from the context of both the user and the applications. When it is sufficiently rich and appropriately modeled, the system can frequently make decisions that benefit users without requiring any explicit intervention on their part.

The context that MUSIC will take into account covers a wide range of information, which can be classified into three categories[2] (a simple illustrative sketch of such a context model follows the list):

  • Computing context[3]: describes the collection of computing devices in the system and the communication links between them, providing information about their current state (e.g. usage, load), capabilities and configurability. Examples include information about available networks and services, the device's screen size and orientation, and the available memory and battery level.
  • Environmental context: in MUSIC, this describes the conditions of the environment in which the user and the host computers reside and move. Dedicated hardware sensors are employed to detect and measure this kind of context data. Examples include the user's location, which makes it possible to determine which objects are near the user, as well as illumination, noise, ...
  • User context: users may specify their preferences regarding the configurable properties of their device. This can include personal data, company and workplace, hobbies, needs, ...
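
The three categories above can be pictured as a simple data model. The following Java sketch is purely illustrative: the class and field names are assumptions made for this example and do not correspond to the actual MUSIC context model or APIs.

  // Illustrative sketch of a context model covering the three categories above.
  // All names are hypothetical; they do not reflect the real MUSIC middleware API.
  import java.util.HashMap;
  import java.util.Map;

  public class ContextModel {

      // Computing context: state and capabilities of devices and communication links.
      public static class ComputingContext {
          int batteryLevelPercent;
          long freeMemoryBytes;
          String activeNetwork;            // e.g. "WLAN", "UMTS"
          int screenWidth, screenHeight;
      }

      // Environmental context: conditions measured by dedicated hardware sensors.
      public static class EnvironmentalContext {
          double latitude, longitude;      // the user's location
          double noiseLevelDb;
          double illuminationLux;
      }

      // User context: preferences and personal information.
      public static class UserContext {
          String role;                     // e.g. "at work", "travelling"
          Map<String, String> preferences = new HashMap<>();
      }

      ComputingContext computing = new ComputingContext();
      EnvironmentalContext environment = new EnvironmentalContext();
      UserContext user = new UserContext();
  }

In the actual middleware, context information is described by a shared context ontology rather than by fixed classes, as explained in the sections below.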

The MUSIC middleware

The development of the MUSIC middleware is the core of the project. The middleware will evaluate the context described above and then automatically select the best available configuration of the applications that fits the requirements imposed by the context. This process has to be transparent to the user.

The middleware is developed in two parts: the context middleware and the adaptation middleware.

Context middleware

The MUSIC context middleware is responsible for monitoring context, managing context information and detecting context changes that should trigger adaptation. MUSIC's general concept of context is needed to exploit the full power of ubiquitous computing environments. Clearly, commonly accepted descriptive means are needed to specify the current context. All context sources must share the same context ontology, which will be used by the context data consumer and by the context broker. These roles can be defined as follows:

  • The context provider: either a network element or an end-user device that provides context data.
  • The context consumer: the application layer (either server or client side) that uses or elaborates context information. This role can also be played by the reasoning engine, which is an intermediate element between the context provider and the application.
  • The context broker: a server element that collects context data (acting as a context consumer) and distributes it (as a context provider) on request. The context broker often adds value to the individual data it collects.

A distributed context service in the MUSIC middleware monitors the context and provides information to the adaptation middleware when the context changes. The context service delivers a defined aggregation and mapping of low-level events to higher-level context change notifications. To this end, it uses a reasoning engine that semantically understands and composes context information and provides the application layer with a fully fledged context representation, supported by learning techniques.
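
A minimal sketch of the three roles defined above, written as Java interfaces, may help to clarify how context data flows through the middleware. The interface and method names are assumptions made for illustration and are not the actual MUSIC context middleware API.

  // Hypothetical interfaces for the context roles described above.
  // Names and signatures are illustrative, not the real MUSIC middleware API.
  import java.util.ArrayList;
  import java.util.List;

  interface ContextProvider {                  // a network element or end-user device
      String getScope();                       // e.g. "device.battery", "user.location"
      Object getCurrentValue();                // latest context value for that scope
  }

  interface ContextConsumer {                  // the application layer or the reasoning engine
      void onContextChanged(String scope, Object newValue);
  }

  // The broker collects context data (acting as a consumer) and redistributes it
  // (acting as a provider), possibly adding value such as aggregation or reasoning.
  class ContextBroker implements ContextConsumer {
      private final List<ContextConsumer> subscribers = new ArrayList<>();

      void subscribe(ContextConsumer consumer) {
          subscribers.add(consumer);
      }

      @Override
      public void onContextChanged(String scope, Object newValue) {
          // In the real middleware, a reasoning engine would map such low-level events
          // to higher-level context change notifications before forwarding them.
          for (ContextConsumer c : subscribers) {
              c.onContextChanged(scope, newValue);
          }
      }
  }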

Furthermore, context information needs to be protected against malicious usage. Security and privacy provisions need to be part of the system design and need to be considered in those MUSIC components that produce and forward context data.

Adaptation middleware

The adaptation middleware is capable of dynamically adapting an application, and itself, to the context in which it executes, as provided by the context middleware. Adaptation can mean that application components are changed or reconfigured. In addition to what MADAM has demonstrated, MUSIC will model and realise service bindings as well as extended situational contexts as part of the context dependencies, based on a service-oriented approach. This means that if an appropriate service is detected at runtime in the execution environment, it can automatically be integrated and replace another software component. This flexible dynamic reconfiguration may apply to service components at both the platform and the application level.

The same context may affect different applications quite differently. In order to make application-tailored adaptation decisions, knowledge about the individual application is needed, such as the dependencies between the application parameters, the available resources and the performance. This knowledge may only be acquired during runtime execution, so the middleware needs to support the application in capturing it for making adaptation decisions. Security and dependability considerations will control these reconfiguration decisions and activities; the middleware needs to support the specification and enforcement of security policies.
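
As a rough illustration of such application-tailored adaptation decisions, the sketch below ranks alternative application configurations by scoring each against the current context and activates the best one. The utility-style scoring and all names are assumptions made for illustration only and do not describe the actual MUSIC adaptation mechanism.

  // Illustrative sketch of context-driven selection among application variants.
  // The names and the scoring scheme are hypothetical, not the MUSIC adaptation API.
  import java.util.List;

  public class AdaptationPlanner {

      // A simplified view of the context information relevant for the decision.
      public static class Context {
          boolean wlanAvailable;
          int batteryLevelPercent;
      }

      // One possible configuration of the application, e.g. a local component
      // or a remotely discovered service bound in its place.
      public interface Variant {
          String name();
          double score(Context ctx);   // predicted usefulness in the given context
          void activate();             // reconfigure the application to this variant
      }

      // Pick the variant with the highest score and reconfigure transparently.
      public static Variant adapt(List<Variant> variants, Context ctx) {
          Variant best = null;
          double bestScore = Double.NEGATIVE_INFINITY;
          for (Variant v : variants) {
              double s = v.score(ctx);
              if (s > bestScore) {
                  bestScore = s;
                  best = v;
              }
          }
          if (best != null) {
              best.activate();         // transparent reconfiguration
          }
          return best;
      }
  }

A service discovered at runtime could be wrapped as another such variant, which matches the service-oriented binding described above; security and dependability policies would further constrain which variants may be activated.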

The MUSIC Studio

The middleware is just a tool that has to be configured in order to adapt to the application requirements. To simplify the developer's work, a development environment will be available: the MUSIC Studio. This platform will be used to develop applications using the MUSIC technology. Furthermore, it substantially facilitates the reuse and evolution of adaptive software solutions by offering a repository of reusable software components and patterns. The MUSIC Studio will be built on top of the Eclipse framework and will be made publicly available as an open-source project.

The objective is to support all tasks of application development, integrating all MUSIC tools into a comprehensive software development environment that makes the development of adaptive applications for ubiquitous computing more efficient and more convenient. More specifically, the MUSIC Studio will integrate:

  • Modeling tools, which support the creation of adaptation models. MUSIC will be based on model-driven development, so a visual model building tool is needed to build models of self-adaptive applications interactively. MUSIC will use UML as the modeling language. The Eclipse Modeling Framework will be the general foundation for the MUSIC modeling and development tool architecture.
  • Transformation tools, which transform platform-independent adaptation models into platform-dependent models and into code, using Java as the target language. In order to generate, deploy and validate the adaptation-enabled applications, several specific model transformations need to be built. The result of the transformation steps will be platform-dependent models targeted at the MUSIC middleware and model-testing facilities; a simplified illustration of this model-to-code step is given after the list.
  • Simulation, testing and tuning tools for the applications. A model-driven development approach nicely supports the validation, testing and tuning of design decisions by simulating application models at early stages of the development process. These tools will help to discover design flaws and performance properties. Based on these insights, the MUSIC tuning tool will provide suggestions for application tuning; for example, conflicting adaptation policies, resource usage conflicts and conflicting security rules will be discovered and repaired. A number of tools and tool platforms for these tasks are generally available (e.g. the Eclipse Test and Performance Tools Platform); these need to be extended and adapted for MUSIC's purposes.
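
As a highly simplified illustration of the model-to-code step mentioned under the transformation tools, the toy program below turns a tiny, hand-written "model" of a component with one context dependency into a Java source skeleton. Both the model format and the generated output are invented for this example and bear no relation to the actual MUSIC Studio transformations.

  // Toy illustration of model-to-code transformation: a minimal component "model"
  // (a name plus one context dependency) is turned into Java source text.
  // Everything here is hypothetical and unrelated to the real MUSIC Studio tools.
  public class ToyModelToCode {

      public static String generate(String componentName, String contextDependency) {
          return "public class " + componentName + " {\n"
               + "    // Generated stub: reacts to changes of \"" + contextDependency + "\"\n"
               + "    public void onContextChanged(Object newValue) {\n"
               + "        // adaptation logic to be filled in by the developer\n"
               + "    }\n"
               + "}\n";
      }

      public static void main(String[] args) {
          // Example: generate a skeleton for a component that depends on network bandwidth.
          System.out.println(generate("VideoStreamer", "network.bandwidth"));
      }
  }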

References

  1. ^ M. Weiser, "The Computer for the Twenty-First Century", Scientific American, September 1991, pp. 94-104.
  2. ^ G. Chen and D. Kotz, "A Survey of Context-Aware Mobile Computing Research", Technical Report TR2000-381, Dartmouth College, Hanover, NH, USA, 2000.
  3. ^ B. Schilit, N. Adams and R. Want, "Context-Aware Computing Applications", IEEE Workshop on Mobile Computing Systems and Applications, Santa Cruz, California, December 1994, IEEE Computer Society Press, pp. 85-90.

External links