The actual implementation of the underlying camera software is not supplied by Symbian OS, as it is highly dependent on the hardware architecture of each mobile device. Each handset manufacturer provides a separate camera implementation, providing access to it through the onboard camera API, the CCamera API. This gives application developers a consistent interface regardless of the actual device on which the camera is present.

The onboard camera API can be accessed directly by the client application for still image capture, viewfinder and settings control.
It can also be accessed indirectly, via the MMF, which uses it to capture video frames before encoding them. For still image capture it is fairly straightforward: the client application requests image capture directly through the onboard camera API. In the case of video capture, the client application usually accesses the camera API directly, to control the viewfinder and video capture settings, but also indirectly, by sharing a handle to the camera with the MMF for video recording.

If you are writing an application that uses the camera of the mobile phone, I invite you to look at Chapter 3, which contains more information about the onboard camera API, with detailed explanations and examples of how to use it.

2.4.5 Tuner

The Tuner component provides an API to control tuner hardware for FM and Radio Data System (RDS), if it is present on the Symbian smartphone.
For example, it can be used to scan for a specific radio station and to play and record received audio signals. From v9.1 of Symbian OS, Symbian provides the Tuner API definition. Device manufacturers can choose to implement the Tuner API on smartphones that contain a radio tuner.[6] So far it has only been used for FM radio, but it could also support digital radio broadcast services.

The Tuner component uses the MMF controller plug-ins for formats and audio data types, and the MDF (more specifically, DevSound) for playback and recording functionality.
Figure 2.6 shows how the Tuner component interacts with the MMF Controller Framework and other components.

[Figure 2.6 The tuner and related components: the client application uses the client API (CMMTunerUtility and related classes), which drives the tuner(s) and the MMF controller framework on top of the tuner hardware.]

A tuner device converts radio signals from an antenna into audio signals and allows the user to select a specific broadcast channel, such as a radio station.

Chapter 7 contains more detailed information about the Tuner component and its API.

[6] S60 currently uses a proprietary tuner implementation and the API is not provided in a public SDK.
UIQ 3 makes the Tuner APIs public and the necessary class definitions can be found in tuner.h. You must link against tuner.lib to use them.

2.4.6 RTP and SIP

The Real-time Transport Protocol (RTP) and Session Initiation Protocol (SIP) shown in Figure 2.3 are not counted as part of the multimedia subsystem. However, RTP is the basis for various streaming media and mobile TV services, and both enable convergence between multimedia and telephony. They are used by VoIP applications.
For example, SIP is used by applications such as Gizmo and Truphone; it's also supported by Google Talk.

RTP provides end-to-end network transport services for data with real-time characteristics, such as interactive audio and video. It is built on top of the User Datagram Protocol (UDP). There are two closely linked parts of the standard: RTP carries data that has real-time properties, and the Real-time Transport Control Protocol (RTCP) provides feedback on the quality of service being provided by RTP.
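The structure of RTP is easiest to see at the packet level. The following sketch is plain, portable C++ with no connection to any Symbian RTP API; it simply parses the fixed 12-byte RTP header defined by RFC 3550, whose fields (sequence number, timestamp, SSRC) are what give RTP its real-time properties:

```cpp
#include <cassert>
#include <cstdint>
#include <optional>
#include <vector>

// Minimal view of the fixed 12-byte RTP header from RFC 3550.
struct RtpHeader {
    uint8_t  version;        // always 2 for current RTP
    bool     padding;        // padding octets present at end of payload
    bool     extension;      // header extension follows the fixed header
    uint8_t  csrcCount;      // number of CSRC identifiers that follow
    bool     marker;         // profile-defined marker bit
    uint8_t  payloadType;    // identifies the media encoding
    uint16_t sequenceNumber; // lets the receiver detect loss and reordering
    uint32_t timestamp;      // sampling instant of the first payload octet
    uint32_t ssrc;           // synchronization source identifier
};

// Parses the fixed header from a raw packet; returns std::nullopt if the
// buffer is too short to contain one.
std::optional<RtpHeader> ParseRtpHeader(const std::vector<uint8_t>& p) {
    if (p.size() < 12) return std::nullopt;
    RtpHeader h;
    h.version        = p[0] >> 6;
    h.padding        = (p[0] >> 5) & 1;
    h.extension      = (p[0] >> 4) & 1;
    h.csrcCount      = p[0] & 0x0F;
    h.marker         = (p[1] >> 7) & 1;
    h.payloadType    = p[1] & 0x7F;
    h.sequenceNumber = static_cast<uint16_t>((p[2] << 8) | p[3]);
    h.timestamp      = (uint32_t(p[4]) << 24) | (uint32_t(p[5]) << 16) |
                       (uint32_t(p[6]) << 8)  |  uint32_t(p[7]);
    h.ssrc           = (uint32_t(p[8]) << 24) | (uint32_t(p[9]) << 16) |
                       (uint32_t(p[10]) << 8) |  uint32_t(p[11]);
    return h;
}
```

RTCP packets (sender and receiver reports) travel on a separate port and carry statistics computed from exactly these fields, which is how the two halves of the standard fit together.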
These are also commonly used in conjunction with the Real-Time Streaming Protocol (RTSP) to provide streaming multimedia services.

More information about RTP and SIP in Symbian OS can be found in the SDKs.

2.5 Future Multimedia Support

In this section, we take a look at what the future holds for multimedia support on Symbian OS. Some features are not yet visible to a third-party application developer, as noted in the appropriate sections, and some may never be made accessible. However, being aware of their existence contributes to a better understanding of the multimedia subsystem.

2.5.1 Metadata Utility Framework (MUF)

Symbian OS v9.5 adds a Metadata Utility Framework (MUF) to the multimedia subsystem. The MUF is independent of the existing MMF layer and provides faster metadata access to any media file.
It defines a generic scheme of metadata fields for the most common media file formats and allows the metadata client to select them as required. The MUF supports album art and both DRM-protected and non-DRM-protected files. It does not extract metadata for DRM-protected files without rights.

The purpose of the MUF is to provide direct and faster access to metadata and to support simple 'Media File Parser' plug-ins (ECOM plug-ins that parse the media files and extract the metadata synchronously or asynchronously).
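The 'select only the fields you need' idea can be illustrated with a small hypothetical sketch in portable C++. None of these class, enum or method names come from the actual MUF headers; a real parser plug-in would read a container format such as ID3 tags or MP4 atoms:

```cpp
#include <cassert>
#include <map>
#include <set>
#include <string>
#include <utility>

// Hypothetical field identifiers; the real MUF defines its own field scheme.
enum class Field { Title, Artist, Album, AlbumArt, Duration };

// A toy stand-in for a 'Media File Parser' plug-in. Here the tags are
// injected directly; a real plug-in would parse them out of the file.
class ToyParser {
public:
    explicit ToyParser(std::map<Field, std::string> tags)
        : tags_(std::move(tags)) {}

    // The client passes the set of fields it wants, mirroring the idea that
    // a metadata client selects fields as required instead of paying for a
    // full parse of every tag in the file.
    std::map<Field, std::string> Extract(const std::set<Field>& wanted) const {
        std::map<Field, std::string> out;
        for (Field f : wanted) {
            auto it = tags_.find(f);
            if (it != tags_.end()) out.emplace(*it);  // absent fields are skipped
        }
        return out;
    }

private:
    std::map<Field, std::string> tags_;
};
```

The point of the shape is that the cost of extraction scales with the fields requested, which is one way a framework like this can be faster than a full controller-based parse.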
There is a client utility class that provides a simple application-level interface to communicate with the MUF, for use by parser plug-in writers.

More information on the MUF is available in the Symbian OS v9.5 documentation.

2.5.2 OpenMAX

OpenMAX is a collection of standards defined by the Khronos Group.[7] Khronos is a member-funded industry consortium focused on the creation of open standard APIs to enable playback and authoring of media on a variety of platforms.
Symbian is a member of the group, as are several licensees and partners.

OpenMAX consists of three layers. The lowest layer is the 'development layer' (OpenMAX DL), which defines a common set of functions to enable porting of codecs across hardware platforms. Generally, a silicon vendor would implement and optimize these functions for their specific hardware.

The next layer up is the 'integration layer' (OpenMAX IL), which serves as a low-level interface to the codecs to enable porting of media libraries across operating systems. This layer would be expected as part of the operating system and, indeed, Symbian has added support for OpenMAX IL audio from Symbian OS v9.5 onwards. In future releases of Symbian OS, it is possible that OpenMAX IL will be used not only for audio but also for other multimedia codecs and for any other processing unit, such as sources, sinks, audio processing effects, mixers, etc.
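To make the 'processing unit' idea concrete, here is a deliberately simplified standalone model of an IL-style component chain. The real OpenMAX IL is a C API built around calls such as OMX_GetHandle and OMX_SendCommand and around shared buffer headers; the C++ classes and names below are illustrative assumptions only:

```cpp
#include <cassert>
#include <memory>
#include <string>
#include <utility>
#include <vector>

// IL components move through a common state machine before they may process
// data (the real IL has more states, e.g. Pause and WaitForResources).
enum class CompState { Loaded, Idle, Executing };

// Base class for a processing unit: a source, codec, mixer, effect or sink.
class Component {
public:
    explicit Component(std::string name) : name_(std::move(name)) {}
    virtual ~Component() = default;

    void SetState(CompState s) { state_ = s; }
    CompState GetState() const { return state_; }

    // Each unit transforms a buffer and hands it downstream.
    virtual std::vector<int> Process(std::vector<int> buf) = 0;

private:
    std::string name_;
    CompState state_ = CompState::Loaded;
};

// A trivial 'codec' that doubles every sample, standing in for a real one.
class GainCodec : public Component {
public:
    GainCodec() : Component("gain") {}
    std::vector<int> Process(std::vector<int> buf) override {
        for (int& s : buf) s *= 2;
        return buf;
    }
};

// A chain of connected units, driven as one by state commands.
class Chain {
public:
    void Add(std::unique_ptr<Component> c) { units_.push_back(std::move(c)); }
    void Command(CompState s) { for (auto& c : units_) c->SetState(s); }

    // Data only flows once every unit has been moved to Executing.
    std::vector<int> Run(std::vector<int> buf) {
        for (auto& c : units_) {
            if (c->GetState() != CompState::Executing) return {};
            buf = c->Process(std::move(buf));
        }
        return buf;
    }

private:
    std::vector<std::unique_ptr<Component>> units_;
};
```

Because every unit implements the same interface and state machine, an operating system can swap hardware-accelerated and software units without the media library above noticing, which is precisely the portability OpenMAX IL is after.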
However, just because support for the standard is provided by the OS, it doesn't mean that it will be adopted immediately by device manufacturers. It is likely that the standard will only be adopted on future hardware platforms.

The highest layer of abstraction is the 'application layer' (OpenMAX AL). This defines a set of APIs providing a standardized interface between an application and multimedia middleware.
This specification has not yet been finalized.

OpenMAX IL is not expected to be exposed to third-party developers in a public SDK in the near future. For further details of OpenMAX and other multimedia standards support on Symbian OS, see Games on Symbian OS by Jo Stichbury et al. (2008), for which more information can be found at developer.symbian.com/gamesbook.

2.5.3 New MDF Architecture

In conjunction with the adoption of OpenMAX IL, Symbian is restructuring the MDF to take full advantage of the modular 'processing unit' model, improve performance and make hardware adaptation easier for licensees.
The new architecture is divided into three planes: management, control and data. The first two planes are responsible for setting up and controlling the data processing chain, which stays entirely within the hardware adaptation layer. This aligns the multimedia architecture with Symbian's new networking architecture, FreeWay.

[7] More information about the Khronos Group can be found at www.khronos.org.

The existing DevSound and DevVideo interfaces will be retained for compatibility.
However, in order to take advantage of performance improvements, MDF clients (such as video controller plug-ins) will have to use the new multimedia host process (MMHP) APIs. The MMHP management APIs will provide functionality to query, select, configure and connect processing units, while the control APIs will be used to implement the standard play, pause, stop, etc. functions. The full MMHP implementation, including video support, is targeted at Symbian OS v9.6.
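Since the MMHP APIs are not public, the following is purely a hypothetical sketch of the management/control split described above; every name in it is invented for illustration and none comes from a Symbian header:

```cpp
#include <cassert>
#include <cstddef>
#include <string>
#include <vector>

// Transport states driven by the control plane.
enum class Transport { Stopped, Playing, Paused };

// A processing unit living in the hardware adaptation layer (data plane).
struct ProcessingUnit { std::string name; };

// Management plane: selects and connects processing units into a chain.
// (The described API also lets clients query and configure units; this
// sketch keeps only the connection step.)
class Management {
public:
    void Connect(const ProcessingUnit& u) { chain_.push_back(u); }
    std::size_t ChainLength() const { return chain_.size(); }

private:
    std::vector<ProcessingUnit> chain_;
};

// Control plane: the familiar transport commands, kept separate from the
// chain-building concerns above.
class Control {
public:
    void Play()  { state_ = Transport::Playing; }
    void Pause() { if (state_ == Transport::Playing) state_ = Transport::Paused; }
    void Stop()  { state_ = Transport::Stopped; }
    Transport State() const { return state_; }

private:
    Transport state_ = Transport::Stopped;
};
```

The design point is the separation itself: data never leaves the adaptation layer, so the management and control interfaces only carry descriptions and commands, not media buffers.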
There is an interim step for audio in v9.5, known as the Advanced Audio Adaptation Framework (A3F), which will continue to be supported in later versions for compatibility. It is unlikely that any of these APIs will be exposed to third-party developers in the near future, but their adoption by licensees will be a key enabler for the delivery of high-performance multimedia applications.

2.5.4 OpenSL ES

OpenSL ES is another standard from the Khronos Group. It is the sister standard of OpenGL ES in the audio area. It has been designed to minimize fragmentation of audio APIs between proprietary implementations and to provide application developers with a standard way to access audio hardware acceleration.
OpenSL ES is also a royalty-free open API and is portable between platforms, like OpenMAX. In fact, OpenMAX AL and OpenSL ES overlap in their support for audio playback, recording and MIDI functionality. A device can choose to support OpenMAX AL and OpenSL ES together or just one of them.

OpenSL ES supports a large set of features, which are grouped into three 'profiles': phone, music and game. Any device can choose to support one or more of these profiles depending on its target market. OpenSL ES also supports vendor-specific extensions to add more functionality to the standard feature set of a profile.

The OpenSL ES API is identical to OpenMAX AL, but adds support for more objects (for example, listener, player and 3D groups) and interfaces (for example, 3DLocation, 3DPlayer, 3DDoppler, 3DMacroscopic) for 3D audio support.

At the time of writing, the OpenSL ES standard has not been finalized, but Symbian have announced their intention to implement it when it is available.

2.5.5 Mobile TV

A number of standards have evolved for broadcasting television to mobile devices.