It’s not always true that a hardware-accelerated codec is faster than a soft codec, because it very much depends on the speed of the CPU, the co-processor, and the quality of the codec. The chief advantage of using hardware acceleration is that the CPU can do other things in parallel while not burdened by the codec task. Hardware accelerators may also provide better energy efficiency if they consume less power than having the CPU running at full speed during audio/video playback.

From an API point of view, Symbian OS abstracts video playback with the Video Player Utility, which we’ll discuss next.

3.12.1 CVideoPlayerUtility

Video Player Utility API Overview
  Library to link against: mediaclientvideo.lib
  Header to include: videoplayer.h
  Required platform security capabilities: MultimediaDD (for setting priority)
  Key classes: CVideoPlayerUtility, MVideoPlayerUtilityObserver
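The key classes above are normally brought together in a small wrapper owned by the game. The declaration below is an illustrative sketch rather than the book's actual class: the member list is assumed here, and the MvpuoFrameReady() and MvpuoEvent() callbacks are included because MVideoPlayerUtilityObserver declares them as pure virtual, even though this chapter only discusses the other three.

// A minimal sketch (not the book's code) of a wrapper class that owns the
// player utility and implements the observer interface listed above.
// Link against mediaclientvideo.lib.
#include <e32base.h>
#include <videoplayer.h>   // CVideoPlayerUtility, MVideoPlayerUtilityObserver

class CVideoPlayerControl;  // the CCoeControl-derived view used later

class CGfxVideoPlayer : public CBase, public MVideoPlayerUtilityObserver
   {
public:
   static CGfxVideoPlayer* NewL(CVideoPlayerControl* aView);
   ~CGfxVideoPlayer();
   void InitControllerL();

   // From MVideoPlayerUtilityObserver
   void MvpuoOpenComplete(TInt aError);
   void MvpuoPrepareComplete(TInt aError);
   void MvpuoFrameReady(CFbsBitmap& aFrame, TInt aError);
   void MvpuoPlayComplete(TInt aError);
   void MvpuoEvent(const TMMFEvent& aEvent);

private:
   CVideoPlayerUtility* iPlayer;   // created in InitControllerL()
   CVideoPlayerControl* iView;     // supplies the window and rectangle
   };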
As Aleks describes in the next chapter, the Symbian multimedia framework (MMF) provides a fairly straightforward utility for playing audio clips; it can also be used for rendering video clips to the screen. The video player framework can play back formats via MMF plug-ins and, as with other MMF plug-ins, they are sourced and integrated by the handset manufacturer. Common video formats that are typically supported on Symbian smartphones include:

• RealVideo 8, 9, 10
• H.263 and MPEG-4
• H.264

You should take care to ensure that any video clips used in a game are encoded with a codec common across target phones, and also that playback is tested on a variety of phones for performance.

The class CVideoPlayerUtility contains a fairly exhaustive NewL() factory method that allows clients of the API to specify the rectangle in which the video will be rendered, by supplying the window and rectangle. The parameters passed to this method are shown in Table 3.6.

Table 3.6 The parameters required for CVideoPlayerUtility::NewL()

MVideoPlayerUtilityObserver& aObserver
   A client class to receive notifications from the video player.

TInt aPriority
   This client's relative priority. This is a value between EMdaPriorityMin and EMdaPriorityMax and represents a relative priority. A higher value indicates a more important request. This parameter is ignored for applications with no MultimediaDD platform security capability.

TMdaPriorityPreference aPref
   The required behavior if a higher priority client takes over the sound output device. One of the values defined by TMdaPriorityPreference.

RWsSession& aWs
   The window server session.

CWsScreenDevice& aScreenDevice
   The software device screen.

RWindowBase& aWindow
   The display window.

const TRect& aScreenRect
   The dimensions of the display window in screen coordinates.

const TRect& aClipRect
   The area of the video clip to display in the window.

The code below initiates the player with a CCoeControl-derived view.

void CGfxVideoPlayer::InitControllerL()
   {
   iPlayer = NULL;
   iPlayer = CVideoPlayerUtility::NewL(*this,
                                       EMdaPriorityNormal,
                                       EMdaPriorityPreferenceNone,
                                       iView->ClientWsSession(),
                                       iView->ScreenDevice(),
                                       iView->ClientWindow(),
                                       iView->VideoRect(),
                                       iView->VideoRect());

   // KVideoFile contains the full path and
   // file name of the video to be loaded
   iPlayer->OpenFileL(KVideoFile);
   }
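The iView accessors passed to NewL() above are not CCoeControl methods; they belong to the view class (CVideoPlayerControl) shown next. One plausible set of implementations is sketched below, assuming the control creates its own window and caches its screen-relative rectangle in iVideoRect; these bodies are assumptions, not code reproduced from the book.

RWsSession& CVideoPlayerControl::ClientWsSession()
   {
   // The window server session shared by every control in this application.
   return ControlEnv()->WsSession();
   }

CWsScreenDevice& CVideoPlayerControl::ScreenDevice()
   {
   return *ControlEnv()->ScreenDevice();
   }

RWindowBase& CVideoPlayerControl::ClientWindow()
   {
   // Valid because the control creates its own window in ConstructL().
   return Window();
   }

const TRect& CVideoPlayerControl::VideoRect() const
   {
   // Screen-relative rectangle calculated in ConstructL() below.
   return iVideoRect;
   }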
Figure 3.22 shows that the screen rectangle used to render video is specified in terms of screen coordinates. The code below shows how to use the PositionRelativeToScreen() method of CCoeControl to calculate the screen coordinates of the control.

Figure 3.22 The video frame will be scaled to fit the rectangle provided (in coordinates relative to the screen)

void CVideoPlayerControl::ConstructL(const TRect& aRect)
   {
   iPlayer = CGfxVideoPlayer::NewL(this);
   CreateWindowL();
   SetRect(aRect);
   iVideoRect = Rect();
   TPoint point = PositionRelativeToScreen();
   // Rect now converted to screen coords
   iVideoRect.Move(point.iX, point.iY);
   ActivateL();
   }
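If the control can later be moved or resized, for example after a layout switch, the cached screen-relative rectangle becomes stale. The sketch below shows one way to refresh it and pass it on with CVideoPlayerUtility::SetDisplayWindowL(); the Utility() accessor on CGfxVideoPlayer is hypothetical and does not appear in the book's example.

void CVideoPlayerControl::SizeChanged()
   {
   // Recalculate the video rectangle in screen coordinates.
   iVideoRect = Rect();
   iVideoRect.Move(PositionRelativeToScreen());

   // Hand the new rectangle to the player so the frames follow the control.
   // SizeChanged() cannot leave, so the update is trapped.
   if (iPlayer && iPlayer->Utility())
      {
      TRAP_IGNORE(iPlayer->Utility()->SetDisplayWindowL(
                     ControlEnv()->WsSession(),
                     *ControlEnv()->ScreenDevice(),
                     Window(),
                     iVideoRect,
                     iVideoRect));
      }
   }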
After this call has succeeded, the supplied MVideoPlayerUtilityObserver-derived class will be called back through a series of events. Table 3.7 lists the events received via the callback and the required response from the game.

Table 3.7 MVideoPlayerUtility callbacks

MvpuoPrepareComplete(TInt aError)
   Notification to the client that the opening of the video clip has completed successfully, or otherwise. This callback occurs in response to a call to CVideoPlayerUtility::Prepare(). The video clip may now be played, or have any of its properties (e.g., duration) queried.

MvpuoPlayComplete(TInt aError)
   Notification that video playback has completed. This is not called if playback is explicitly stopped by calling Stop().

MvpuoOpenComplete(TInt aError)
   Notification to the client that the opening of the video clip has completed successfully, or otherwise.

For the case of simple playback, the callbacks just advance the utility to the next state, until finally Play() can be called to start showing video frames. This is demonstrated in the code below.

void CGfxVideoPlayer::MvpuoOpenComplete(TInt aError)
   {
   if(aError == KErrNone)
      iPlayer->Prepare();
   }

void CGfxVideoPlayer::MvpuoPrepareComplete(TInt aError)
   {
   if(aError == KErrNone)
      iPlayer->Play();
   }

If an error occurs at any stage, it's up to the class to decide what to do and whether to inform the user of the failure.
Typical failures will be KErrNoMemory, or KErrNotSupported if there is no codec available on the device to render the video.

Figure 3.23 shows the video example mid-playback. Notice that the video has automatically been scaled to fit the application rectangle passed to it. The codec preserves the aspect ratio of the video when scaling, which results in black bars on the top and bottom. This is sometimes called ‘letterboxing.’

Figure 3.23 Example of video playback

And that's about it, really: once the video has completed, MvpuoPlayComplete() will be called and the class can safely be deleted.
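A minimal sketch of that final callback is shown below. The Close() call is standard CVideoPlayerUtility API, but the way completion is reported back to the view (the VideoFinished() call) is a hypothetical hook invented for this sketch, not part of the book's example or the MMF API.

void CGfxVideoPlayer::MvpuoPlayComplete(TInt aError)
   {
   // Playback has finished, or failed part-way through.
   if (iPlayer)
      {
      iPlayer->Close();
      }

   // Let the owning view decide what happens next, for example returning
   // to the game loop or showing an error note. VideoFinished() is a
   // hypothetical method on the view class.
   if (iView)
      {
      iView->VideoFinished(aError);
      }
   }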
Combining Video with Other Elements

There is currently no standard way of implementing overlays on top of video, since the video rendering code may use DSA or hardware support to render frames, leaving no opportunity for an application to draw overlays on top. Video always fills the rectangle given to it, but there is no problem combining that area with other graphical elements, as demonstrated in Figure 3.24.

Figure 3.24 A video played with ornaments

The window that the rectangle occupies can be any size and does not have to fill the screen (as demonstrated in the mock-up in Figure 3.24). It's easy to incorporate graphics around the edges of a video by using normal CCoeControl drawing methods, as sketched below.
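As an illustration only (the book gives no code for this), a containing control could paint its decoration in Draw() and leave the video control's own window to show the frames. The container class, its cached iVideoRect member, and the colors chosen are all assumptions made for this sketch.

void CGfxVideoContainer::Draw(const TRect& /*aRect*/) const
   {
   CWindowGc& gc = SystemGc();

   // Paint the background of the whole container; the video control sits
   // in its own window on top of this, so its frames are not overdrawn here.
   gc.SetPenStyle(CGraphicsContext::ENullPen);
   gc.SetBrushStyle(CGraphicsContext::ESolidBrush);
   gc.SetBrushColor(KRgbBlack);
   gc.DrawRect(Rect());

   // Draw a simple frame (the 'ornament') around the rectangle that was
   // handed to the video control; iVideoRect is cached when the layout
   // is set up.
   gc.SetBrushStyle(CGraphicsContext::ENullBrush);
   gc.SetPenStyle(CGraphicsContext::ESolidPen);
   gc.SetPenColor(KRgbWhite);
   gc.DrawRect(iVideoRect);
   }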
3.13 Less Useful APIs

The following APIs may, on the surface, seem useful to the game developer but are generally not intended for that use. You have been warned!

3.13.1 Sprites

The term ‘sprite’ is heavily loaded towards game primitives. A sprite in computing usually refers to a sort of overlay icon which appears over the background and which takes care of preserving the background as it moves. On Symbian OS, sprites run in the context of the window server and are only really designed for mouse pointer effects.

3.13.2 2D Accelerated Drawing Operations

If 2D hardware acceleration is implemented by a handset manufacturer, then the Symbian OS bitmap and window server code will use acceleration internally. Direct access using the graphics-accelerated bitmap functions is unlikely to work. I recommend that you use OpenGL and OpenVG instead.

3.13.3 CImageDisplay

The CImageDisplay class from the MMF may be useful for displaying animated GIFs and for scaling images for display on the screen, but implementations of the API are not available on S60 3rd Edition devices.

3.14 Summary

This chapter has introduced the various fundamental graphics frameworks available in Symbian OS and provided examples of how to:

• draw using WSERV and the CONE graphics frameworks
• access the screen directly using direct screen access (DSA)
• use double buffering techniques on Symbian OS
• play back video clips
• draw international text
• deal with different screen orientations
• cope with scaling for different screen resolutions.

Having got to the end of the chapter, you should now better understand the relevance of the Symbian graphics APIs to a games environment, and you can use the examples as starting points for game rendering code.

4 Adding Audio to Games on Symbian OS

Aleks Garo Pamir

This chapter covers a variety of audio technologies supported by Symbian OS and demonstrates different techniques to create sound and music on Symbian OS-based devices.
It presents an overview of the audio APIs relevant to game development in Symbian OS v9.1 and later. Game developers can use these APIs to play sound effects and background music in mobile games developed on Symbian OS.

4.1 Introduction

Audio has been neglected in game development since the beginning. Developers have focused more on the graphics capabilities and less on the sound and music capabilities of video game devices (as the ‘video’ in video game implies).
Flashy visuals, the number of colors, and the polygon counts in 3D games have always been the main attractions.

Unfortunately, this unwritten rule hasn’t changed much in Symbian OS devices, even though phones have been primarily sound-processing devices from the beginning. The sound-processing functions in phones have actually been delegated to special processors inside the phones that only process data using specific voice-only codecs. Basic ringtone operations were the only audio functionality performed by the PDA portion of the phone.

Audio support in Symbian OS phones has only started to develop in recent years. One factor was the success of the ringtone business and the increasing popularity of more technically advanced (polyphonic, digitized, MP3) ringtones.