The range targets different segments with devices optimized for music, still photography, video and games. Figure 1.3 shows a range of some of the most recent Symbian smartphones.

At the time of writing, the current flagship device, the Nokia N96, boasts HSDPA for download speeds of up to 3.6 Mbps, WiFi, a five-megapixel camera with auto-focus and flash, built-in GPS, DVB-H TV broadcast receiver, TV out, support for all of the popular media formats and much more.

Similarly, other manufacturers are keen to promote the multimedia capabilities of their handsets. For example, Sony Ericsson is leveraging established brands, Walkman and Cybershot, from Sony’s music player and camera businesses to enhance the perception of its mobile phone offerings.
Motorola is actively marketing the advanced video capabilities of its MOTORIZR Z8 and MOTO Z10 smartphones, emphasizing their capability to capture and edit video footage without the need for a PC. Motorola also bundles full-length feature films on removable storage with those products in some markets.

6 Games on Symbian OS by Jo Stichbury et al. (2008). See developer.symbian.com/gamesbook for more information.
7 See www.nseries.com for more details.

Figure 1.3 A range of Symbian smartphones (June 2008)

Smartphone multimedia functionality will continue to advance in the years ahead with new hardware in the form of advanced graphics and video accelerators, audio data engines, and faster processors with multiple cores.
These will enable high-definition video output, rich games experiences, more advanced audio effects and video editing, as well as longer playback times, to rival existing dedicated players. Of course, to keep Symbian smartphones on the cutting edge of mobile multimedia, the multimedia subsystem in Symbian OS has had to keep evolving.

1.5 Evolution of the Multimedia Subsystem in Symbian OS

Having explained the background against which Symbian OS has developed, we’re going to take a closer look at its multimedia subsystem and how it has evolved, in order to better understand its current design and use. Readers who are either familiar with the history or not interested in the technical details can safely skip ahead to Section 1.6 for a peek into the future!

If you’re not already aware of the history of Symbian OS releases, then we’d recommend taking a look at a visual timeline, which you can find on the wiki site for this book at the developer.symbian.com/multimediabook wiki page.

In order to explain why many radical improvements were made to the multimedia subsystem in Symbian OS v7.0s, and then further improvements in the later versions, it is necessary to give an overview of the subsystem in previous versions of the operating system and the problems they faced.

1.5.1 The Media Server

In early Symbian OS releases, v6.1 and v7.0, all multimedia processing was performed through the Media Server.
This was a standard Symbian OS server; it operated in a single thread, using an event-driven framework (an active scheduler with multiple active objects) that provided all multimedia functionality.

The server supported the playback and recording of audio, along with the encoding, decoding and manipulation of still images. Symbian OS shipped with support for an array of audio and image formats, which could be extended by writing proprietary plug-ins.

In order to use the server, a client could either explicitly instantiate a connection to the server or allow the client APIs to provide a connection automatically. Each client would supply an observer class to the server, which would allow the server to communicate by passing messages back to the calling application or library.
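The client-utility classes that wrap this observer pattern survived largely intact into later releases, so the following sketch is representative rather than tied to one version: the application implements the MMdaAudioPlayerCallback observer, and the player utility opens the clip asynchronously and reports progress through the callbacks. The class name CExamplePlayer is purely illustrative, and error handling is kept to a minimum.

#include <e32base.h>                // CBase
#include <mdaaudiosampleplayer.h>   // CMdaAudioPlayerUtility, MMdaAudioPlayerCallback

// Illustrative client class; not part of any Symbian OS API.
class CExamplePlayer : public CBase, public MMdaAudioPlayerCallback
    {
public:
    void ConstructL(const TDesC& aFileName)
        {
        // The utility hides the connection to the underlying framework;
        // results are delivered through the observer callbacks below.
        iPlayer = CMdaAudioPlayerUtility::NewFilePlayerL(aFileName, *this);
        }

    ~CExamplePlayer()
        {
        delete iPlayer;
        }

    // Called once the clip has been opened and examined.
    void MapcInitComplete(TInt aError, const TTimeIntervalMicroSeconds& /*aDuration*/)
        {
        if (aError == KErrNone)
            {
            iPlayer->Play();
            }
        }

    // Called when playback completes (or fails).
    void MapcPlayComplete(TInt /*aError*/)
        {
        }

private:
    CMdaAudioPlayerUtility* iPlayer;
    };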
The server kept a list of clients and concurrently cycled through their multimedia requests. This meant that a number of different clients could use the server at the same time and, for example, enabled an application to play back audio whilst simultaneously decoding images for display. Although this sounds ideal, the practical issues in producing this kind of behavior were complicated and fraught with difficulties.

If, for example, an application actually did try to use two parts of the server’s functionality at the same time, the latency created by having one process controlling both could make the system virtually unusable. For instance, an intensive task, such as decoding an image, would interfere with any real-time multimedia task. The situation was made worse when poorly written third-party plug-ins were used (they were often converted from code not originally written for Symbian OS and failed to fit the co-operative multi-tasking model). The plug-in framework itself was also extremely complicated to write for, which did not improve the situation.

Handling the demands of high-performance multimedia applications such as streaming video, CD-quality audio, mobile commerce or location-based services, coupled with the fact that connecting to the server could take, in the worst case, several seconds, meant improvements were necessary.

1.5.2 The Multimedia Framework: The Beginning of a New Era

In 2001, Symbian began to write an entirely new multimedia subsystem that would successfully allow different areas of the subsystem to be used simultaneously and would provide a lightweight framework, redesigned from the ground up for mobile media, with powerful enhancements such as multiple threads, format recognition, streaming, and a plug-in media component library.
With a new foundation of base media classes and an extensible controller framework, licensees and third-party developers could now undertake far more effective multimedia application development for Symbian OS.

The new architecture was not based on a central server and instead split up the various multimedia sections so that each application could use the functionality it needed independently. It also used multiple concurrent threads, avoiding the side effects that were seen in the Media Server.

The new subsystem retained many of the same client interfaces as the Media Server, but took advantage of a new plug-in-resolving methodology, known as ECOM (which is discussed further in Chapter 2).
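To give a flavor of what ECOM-based resolution looks like to the code that uses it (Chapter 2 covers the framework properly), a client can ask ECOM to enumerate the plug-ins that implement a given interface and then instantiate one of them. In this sketch the interface UID is a made-up placeholder, and the include paths may differ slightly between SDK versions.

#include <e32debug.h>                        // RDebug
#include <ecom/ecom.h>                       // REComSession
#include <ecom/implementationinformation.h>  // CImplementationInformation, RImplInfoPtrArray

// Hypothetical interface UID, used purely for illustration.
const TUid KExampleInterfaceUid = { 0x10001234 };

void ListExampleImplementationsL()
    {
    RImplInfoPtrArray implementations;

    // Ask the ECOM framework for every plug-in registered against the interface.
    REComSession::ListImplementationsL(KExampleInterfaceUid, implementations);

    for (TInt i = 0; i < implementations.Count(); ++i)
        {
        // Each entry describes one plug-in; a client would typically pick one
        // and instantiate it with REComSession::CreateImplementationL().
        RDebug::Print(_L("ECOM plug-in: %S"), &implementations[i]->DisplayName());
        }

    implementations.ResetAndDestroy();
    }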
An architecture based on plug-ins provides the flexibility to extend the built-in functionality with a wide range of plug-in implementations. This translates into an open multimedia platform, enabling both licensees and third-party developers to implement specialized proprietary components.

The new subsystem was so successful that it was quickly integrated into Symbian OS v7.0s, which was just beginning its development lifecycle. As development of that release progressed, the audio and video parts of the subsystem evolved into what is now known as the Multimedia Framework (MMF). The MMF is a multithreaded framework geared towards multimedia plug-in writers. While it retains a subset of the original application programming interfaces (APIs) from v6.1 and v7.0, it also provides numerous enhancements.

The basic structure of the MMF consists of a client API layer, a controller framework, controller plug-ins and lower-level subsystems that are implemented by a licensee when a new smartphone is created (and are generally hardware specific).

As Figure 1.4 shows, still-image processing is handled by a separate part of the subsystem, the Image Conversion Library (ICL), while still-image capture and camera viewfinder functionality are handled by the onboard camera API (ECam). Neither of these is part of the MMF.
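To give a sense of how an application sits on top of the ICL, here is a minimal sketch that decodes the first frame of an image file into a caller-supplied bitmap. It is written synchronously for brevity (a real application would drive the conversion from an active object) and assumes the caller has already connected sessions to the file server and the font and bitmap server.

#include <f32file.h>          // RFs
#include <fbs.h>              // CFbsBitmap
#include <imageconversion.h>  // CImageDecoder (ICL)

void DecodeFirstFrameL(RFs& aFs, const TDesC& aFileName, CFbsBitmap& aBitmap)
    {
    // The ICL resolves a suitable decoder plug-in for the file automatically.
    CImageDecoder* decoder = CImageDecoder::FileNewL(aFs, aFileName);
    CleanupStack::PushL(decoder);

    // Size the destination bitmap to match the first frame of the image.
    const TFrameInfo& frameInfo = decoder->FrameInfo(0);
    User::LeaveIfError(aBitmap.Create(frameInfo.iOverallSizeInPixels,
                                      frameInfo.iFrameDisplayMode));

    // Decode frame 0 into the bitmap (synchronous here purely for brevity).
    TRequestStatus status;
    decoder->Convert(&status, aBitmap, 0);
    User::WaitForRequest(status);
    User::LeaveIfError(status.Int());

    CleanupStack::PopAndDestroy(decoder);
    }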
Video recording, despite effectively being the rapid capture and processing of multiple still images, is typically implemented using an MMF camcorder plug-in,8 which drives the necessary encoding process and controls transfer of the video stream from the camera and microphone to a file. Such an implementation was chosen to extend the ability of the camcorder application to use the pool of available encoding formats via the MMF.

8 On smartphones with high-resolution video recording capabilities, the capture and encoding is often performed on a dedicated hardware accelerator.

Figure 1.4 The Symbian OS v7.0s multimedia subsystem: applications sit above the Multimedia Framework (MMF), the onboard camera API (ECam) and the Image Conversion Library (ICL), all of which sit above the hardware abstraction layer

In Symbian OS v8, the multimedia framework provides audio recording and playback and audio streaming functionality.