It is not possible to play the video clip before CVideoPlayerUtility finishes the preparation. The following steps explain how to play a video clip using CVideoPlayerUtility:

• Create an observer class, which implements MVideoPlayerUtilityObserver.
• Create an instance of CVideoPlayerUtility and pass it a reference to the observer.
• Open the video clip by calling CVideoPlayerUtility::OpenFileL().
• Wait until MVideoPlayerUtilityObserver::MvpuoOpenComplete() is called.
• Prepare the video clip to be accessed by calling the CVideoPlayerUtility::Prepare() method.
• Wait until MVideoPlayerUtilityObserver::MvpuoPrepareComplete() is called.
• Call CVideoPlayerUtility::Play() to start video playback.
• When the video has been played completely, MVideoPlayerUtilityObserver::MvpuoPlayComplete() is called.

In the full sample code for this recipe, the CSimpleVideoPlayer class implements MVideoPlayerUtilityObserver.

Discussion: The factory methods of CSimpleVideoPlayer require one parameter to be passed.
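Because each step completes asynchronously, the open/prepare/play sequence is effectively a small state machine driven by the observer callbacks. The following sketch models that callback ordering in plain, portable C++ (class and state names are hypothetical; this is an illustration of the control flow, not the real Symbian API):

```cpp
#include <cassert>

// Hypothetical model of the CVideoPlayerUtility callback sequence.
// States mirror the steps in the recipe: open -> prepare -> play -> complete.
enum class VideoState { EIdle, EOpening, EPreparing, EPlaying, EDone };

class SimpleVideoPlayerModel
    {
public:
    VideoState State() const { return iState; }

    // Models CVideoPlayerUtility::OpenFileL().
    void OpenFile()
        {
        assert(iState == VideoState::EIdle);
        iState = VideoState::EOpening;
        }

    // Models MvpuoOpenComplete(): only now may Prepare() be called.
    void MvpuoOpenComplete()
        {
        assert(iState == VideoState::EOpening);
        iState = VideoState::EPreparing;   // models calling Prepare()
        }

    // Models MvpuoPrepareComplete(): playback may start only after this.
    void MvpuoPrepareComplete()
        {
        assert(iState == VideoState::EPreparing);
        iState = VideoState::EPlaying;     // models calling Play()
        }

    // Models MvpuoPlayComplete().
    void MvpuoPlayComplete()
        {
        assert(iState == VideoState::EPlaying);
        iState = VideoState::EDone;
        }

private:
    VideoState iState = VideoState::EIdle;
    };
```

In the real recipe these transitions happen inside CSimpleVideoPlayer's implementations of the MVideoPlayerUtilityObserver callbacks; the asserts correspond to the rule that you must wait for each callback before issuing the next request.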
The parameter's type is CCoeControl&, and it is the control where the video is to be displayed.

Let's take a look at the factory function of CVideoPlayerUtility to see how it is going to be used. It is defined as follows:

CVideoPlayerUtility* NewL(MVideoPlayerUtilityObserver& aObserver,
                          TInt aPriority,
                          TMdaPriorityPreference aPref,
                          RWsSession& aWs,
                          CWsScreenDevice& aScreenDevice,
                          RWindowBase& aWindow,
                          const TRect& aScreenRect,
                          const TRect& aClipRect);

The aObserver parameter is the reference to the observer which will receive notifications. The aPriority and aPref parameters are the video client's priority and preference, respectively.
As with CMdaAudioPlayerUtility, they require the MultimediaDD capability. A discussion of MultimediaDD is beyond the scope of this book.

The aWs parameter is the reference to the window server session. You can use the shared window server session returned by CCoeEnv::WsSession(). For example:

RWsSession& wsSession = aControl.ControlEnv()->WsSession();

The aScreenDevice parameter is the reference to the screen device. You can usually use the default screen device owned by CCoeEnv, returned by CCoeEnv::ScreenDevice().
For example:

CWsScreenDevice* screenDevice = aControl.ControlEnv()->ScreenDevice();

The aWindow parameter is the handle of the window for the video. You can use CCoeControl::DrawableWindow() to get the client-side window handle of a control:

RWindowBase& windowBase = *aControl.DrawableWindow();

The aScreenRect parameter is the rectangle where the video is displayed on the screen. The position is relative to the origin of the screen, not to the origin of the control.

The aClipRect parameter is the area of the video clip to be displayed. In most cases, this parameter has the same value as aScreenRect, which means the whole area of the video is displayed. As with aScreenRect, the position is relative to the origin of the screen, not to the origin of the control.

The supported codecs and formats of CVideoPlayerUtility depend on the plug-ins installed on the device.
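Because aScreenRect and aClipRect are given in screen coordinates rather than control coordinates, a common mistake is to pass the control's own rectangle directly. This portable sketch (plain C++ stand-ins for TPoint/TRect; the helper name is hypothetical) shows the conversion a control would typically perform before calling NewL():

```cpp
#include <cassert>

// Minimal stand-ins for Symbian's TPoint and TRect (illustration only).
struct Point { int iX; int iY; };
struct Rect  { int iTlX; int iTlY; int iBrX; int iBrY; };

// A control's rectangle is relative to its own origin, but the video API
// wants screen coordinates. Shift the rectangle by the control's position
// on the screen (on Symbian OS this position is what
// CCoeControl::PositionRelativeToScreen() supplies).
Rect ToScreenRect(const Rect& aControlRect, const Point& aScreenPos)
    {
    return Rect{ aControlRect.iTlX + aScreenPos.iX,
                 aControlRect.iTlY + aScreenPos.iY,
                 aControlRect.iBrX + aScreenPos.iX,
                 aControlRect.iBrY + aScreenPos.iY };
    }
```

For a QCIF-sized (176 x 144) control sitting below a 44-pixel status pane, the screen rectangle's top edge would be at y = 44, not y = 0.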
Most Symbian OS devices support the H.263 codec, for example, but only some support the H.264 codec.

Note that there are other methods to open video clips from other sources, such as CVideoPlayerUtility::OpenUrlL().

The CVideoPlayerUtility::SetDisplayWindowL() method changes the display window of the video. It can be used to display the video playback in another control. It can also be used to change the area where the video is played.
Its parameters are the same as those used in the factory function of CVideoPlayerUtility.

What may go wrong when you do this: There is one additional thing that you need to take care of: the possibility of a screen orientation change.
For example, some S60 devices change orientation when the cover of the device is opened. If you don't respond to the orientation change, your video will not be displayed properly. What you need to do is override the CEikAppUi::HandleResourceChangeL() method and then call CVideoPlayerUtility::SetDisplayWindowL() to update the position and size of the video playing area.

4.7.2.3 Audio Streaming

Amount of time required: 30 minutes
Location of example code: \Multimedia\AudioStreaming
Required libraries: mediaclientaudiostream.lib
Required header file(s): MdaAudioOutputStream.h
Required platform security capability(s): None

Problem: You want to play an audio clip in streaming mode.
The audio clip is read chunk by chunk, incrementally. Audio streaming may be needed when, for example, you want to process the audio from the file before you play it, or you are getting the audio clip from the network. An Internet radio client is an example of an application that needs audio streaming.

Solution: The class to stream audio is CMdaAudioOutputStream. The following lists the steps to stream audio:

• Create an observer class, which implements MMdaAudioOutputStreamCallback.
• Create an instance of CMdaAudioOutputStream and pass it a reference to the observer.
• Open the audio stream by calling CMdaAudioOutputStream::Open().
• Once the stream has been opened, the callback method, MMdaAudioOutputStreamCallback::MaoscOpenComplete(), is called.
• Start streaming audio by calling CMdaAudioOutputStream::WriteL().
• Once the buffer has been copied to the lower layers of MMF, the callback method, MMdaAudioOutputStreamCallback::MaoscBufferCopied(), is called.
Now we can call WriteL() to copy the next buffer.

In our recipe, the CAudioStreamPlayer class implements MMdaAudioOutputStreamCallback.

Discussion: The CMdaAudioOutputStream class plays the audio using the sound driver. It relies on the hardware DSP (Digital Signal Processing) codecs.
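The WriteL()/MaoscBufferCopied() handshake in the steps above forms a loop: each completed copy triggers the next write until the clip is exhausted. This portable sketch (hypothetical names, no Symbian SDK required) models that loop with a vector of chunks:

```cpp
#include <cassert>
#include <string>
#include <vector>

// Models the WriteL()/MaoscBufferCopied() cycle of CMdaAudioOutputStream.
class AudioStreamModel
    {
public:
    explicit AudioStreamModel(std::vector<std::string> aChunks)
        : iChunks(std::move(aChunks)) {}

    // Models the first WriteL() call, issued after MaoscOpenComplete().
    void Start() { WriteNext(); }

    // Models MaoscBufferCopied(): the previous buffer has been consumed
    // by the lower layers of MMF, so the next chunk may be written.
    void MaoscBufferCopied() { WriteNext(); }

    int ChunksWritten() const { return iNext; }

private:
    void WriteNext()
        {
        if (iNext < static_cast<int>(iChunks.size()))
            {
            ++iNext;   // models iStream->WriteL(iChunks[iNext])
            }
        }

    std::vector<std::string> iChunks;
    int iNext = 0;
    };
```

In the real recipe, CAudioStreamPlayer performs the equivalent of WriteNext() inside its MaoscBufferCopied() implementation, reading the next chunk from the file before handing it to the stream.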
No MMF controller plug-in is involved. The supported formats depend on the device's DSP. All Symbian OS devices support PCM formats. Some of them also support compressed formats, such as AMR and MP3.

CMdaAudioOutputStream::Open() requires one parameter of type TMdaPackage*. You can ignore this parameter because it is maintained for historical reasons only.

After MMdaAudioOutputStreamCallback::MaoscOpenComplete() is called, you are ready to stream the audio clip. There are several properties that need to be set to match the data you are streaming, such as the format, sampling rate and number of channels.

The format is set by calling the CMdaAudioOutputStream::SetDataTypeL() method.
It requires one parameter of type TFourCC, which is the FourCC (Four-Character Code) of the audio:

TRAP(err, iAudioStream->SetDataTypeL(KMMFFourCCCodePCM16));

Note that we use a TRAP because we call this method inside a non-leaving method, MaoscOpenComplete().

The list of FourCC constants can be found in \epoc32\include\mmf\common\MmfFourCC.h. Here are some examples of the predefined FourCC constants:

• KMMFFourCCCodePCM8 = (' ', ' ', 'P', '8')
• KMMFFourCCCodePCM16 = (' ', 'P', '1', '6')
• KMMFFourCCCodeAMR = (' ', 'A', 'M', 'R')
• KMMFFourCCCodeAAC = (' ', 'A', 'A', 'C')
• KMMFFourCCCodeMP3 = (' ', 'M', 'P', '3')

You can use CMMFDevSound::GetSupportedOutputDataTypesL() to get the list of supported FourCCs on a particular device. Note that some SDKs may not distribute the header file of CMMFDevSound.

The sampling rate and number of channels can be set by calling CMdaAudioOutputStream::SetAudioPropertiesL(). The possible values for the sampling rate and number of channels are defined in TMdaAudioDataSettings.
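Under the hood, a FourCC such as those listed above is just four 8-bit characters packed into a 32-bit integer. The sketch below is portable C++ and assumes the first character goes into the least significant byte, which matches the usual FourCC convention (the exact byte order used by TFourCC is an assumption here, not something this recipe relies on):

```cpp
#include <cassert>
#include <cstdint>

// Packs four characters into a 32-bit FourCC value, first character in
// the least significant byte.
constexpr std::uint32_t FourCC(char a1, char a2, char a3, char a4)
    {
    return  static_cast<std::uint32_t>(static_cast<unsigned char>(a1))
         | (static_cast<std::uint32_t>(static_cast<unsigned char>(a2)) << 8)
         | (static_cast<std::uint32_t>(static_cast<unsigned char>(a3)) << 16)
         | (static_cast<std::uint32_t>(static_cast<unsigned char>(a4)) << 24);
    }
```

With this packing, (' ', 'P', '1', '6') becomes the integer 0x36315020: '6' (0x36) in the top byte down to the space (0x20) in the bottom byte.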
TMdaAudioDataSettings defines several sampling rates, from ESampleRate8000Hz up to ESampleRate64000Hz. There are two supported channel configurations: EChannelsMono and EChannelsStereo (no surround sound).

After all the properties have been set up, you can start writing the data stream to the audio device. This is done by calling CMdaAudioOutputStream::WriteL(). It requires a parameter of type const TDesC8&:

TRAP(err, iAudioStream->WriteL(iBuffer));

The optimal size of the buffer depends on your needs and the audio format. Ideally, you want to use the smallest possible amount of memory without risking a buffer underflow situation.

Tip: In order to avoid an out-of-data situation, you may want to use more than one buffer. For example, you may use two buffers.
You pass one buffer to the audio stream and use the other buffer to read from the file. Once a buffer has been handed to the audio device, you can refill the other one with the next data. The example in this book uses one buffer for simplicity.

In some applications with a heavily loaded thread, such as games, a large buffer may not solve the underflow situation. In this case, you may consider running the audio stream in a separate thread with higher priority. Some audio-intensive applications may need to implement adaptive buffer management algorithms to deal with different performance situations.

You can stop the streaming by calling CMdaAudioOutputStream::Stop().
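The two-buffer tip above can be sketched as a minimal double buffer: while one buffer is queued to the audio device, the other is refilled from the source, and the roles swap on each completion callback. This is portable C++ with hypothetical names; on Symbian OS the swap would happen inside MaoscBufferCopied():

```cpp
#include <cassert>
#include <string>

// A minimal double buffer for streaming: one buffer is "playing"
// (queued to the device), the other is free to be refilled.
class DoubleBuffer
    {
public:
    // The buffer currently handed to the audio stream.
    std::string& Playing() { return iBuffers[iPlaying]; }

    // The buffer that may safely be refilled in the meantime.
    std::string& Filling() { return iBuffers[1 - iPlaying]; }

    // Called when the device has consumed the playing buffer
    // (the MaoscBufferCopied() moment): roles swap.
    void Swap() { iPlaying = 1 - iPlaying; }

private:
    std::string iBuffers[2];
    int iPlaying = 0;
    };
```

This keeps the device fed as long as refilling a buffer takes less time than playing the other one; otherwise the thread-priority or adaptive approaches mentioned above are needed.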