- Download source code - 247 KB
- Download documentation - 963 KB
Introduction
Ever since I started using VLC Media Player, I have been impressed with its capabilities, especially its built-in codecs, which require no further installation. After exploring the VLC structure a little further, I found the libvlc.dll module, which is an API for the entire VLC engine and contains a rich set of rendering, streaming, and transcoding functionality. libvlc is a native DLL which exposes hundreds of C method calls. The main goal of this article is to provide a .NET API for the libVLC interface so that the vast majority of VLC functionality can be utilized in managed applications.
VLC 1.1.x introduced several improvements and fixes detailed here; the most compelling ones are GPU decoding and a simplified libVLC API with no exception handling. Version 1.1.1 also adds support for the Google WebM video format.
P/Invoke
In order to use libvlc in a managed application, it has to be wrapped by some kind of interoperability layer. There are three ways to accomplish this:
- C++/CLI
- COM Interop
- P/Invoke
Since libvlc is a native library which exports pure C methods, P/Invoke is chosen here.
If you are planning to enrich your knowledge of P/Invoke, libvlc is a great place to start. It has a large number of structures, unions, and callback functions, and some methods require custom marshalling to handle double pointers and string conversions.
We have to download the VLC source code to better understand the libvlc interface. Please follow this link. After extracting the archive content, go to the directory containing the libvlc header files. In case you want to use them directly in a native (C/C++) application, there is an excellent article explaining how.
Custom marshalling
The entry point to the libvlc interface is the libvlc_new API defined in libvlc.h:

```c
VLC_PUBLIC_API libvlc_instance_t * libvlc_new( int argc , const char *const *argv );
```

argv is a double pointer to a set of strings which control the behavior of the VLC engine, such as disabling the screensaver, not using any of the built-in UIs, and so on.
The managed method declaration uses custom marshalling attributes to instruct the .NET runtime how to pass an array of System.String objects in the expected native format:

```csharp
[DllImport("libvlc")]
public static extern IntPtr libvlc_new(
    int argc,
    [MarshalAs(UnmanagedType.LPArray, ArraySubType = UnmanagedType.LPStr)] string[] argv);
```

Note that the return type of the method is an IntPtr which holds the native pointer to the libvlc_instance_t structure.
Structures
Here is the libvlc_log_message_t definition taken from libvlc_structures.h:

```c
typedef struct libvlc_log_message_t
{
    unsigned    sizeof_msg;   /* sizeof() of message structure, must be filled in by user */
    int         i_severity;   /* 0=INFO, 1=ERR, 2=WARN, 3=DBG */
    const char *psz_type;     /* module type */
    const char *psz_name;     /* module name */
    const char *psz_header;   /* optional header */
    const char *psz_message;  /* message */
} libvlc_log_message_t;
```
The managed analog of this structure is pretty straightforward:

```csharp
[StructLayout(LayoutKind.Sequential)]
public struct libvlc_log_message_t
{
    public UInt32 sizeof_msg;
    public Int32 i_severity;
    public IntPtr psz_type;
    public IntPtr psz_name;
    public IntPtr psz_header;
    public IntPtr psz_message;
}
```

LayoutKind.Sequential means that all the members of the structure are laid out sequentially in native memory.
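To see the sequential layout at work without touching libvlc, the structure can be round-tripped through unmanaged memory with the Marshal class. This is only a sketch; LayoutDemo is a hypothetical helper for illustration, not part of nVLC:

```csharp
using System;
using System.Runtime.InteropServices;

[StructLayout(LayoutKind.Sequential)]
public struct libvlc_log_message_t
{
    public UInt32 sizeof_msg;
    public Int32 i_severity;
    public IntPtr psz_type;
    public IntPtr psz_name;
    public IntPtr psz_header;
    public IntPtr psz_message;
}

public static class LayoutDemo
{
    // Copies the struct into unmanaged memory and back, which is what the
    // runtime does when passing it to a native function by reference.
    public static libvlc_log_message_t RoundTrip(libvlc_log_message_t msg)
    {
        IntPtr buffer = Marshal.AllocHGlobal(Marshal.SizeOf(typeof(libvlc_log_message_t)));
        try
        {
            Marshal.StructureToPtr(msg, buffer, false);
            return (libvlc_log_message_t)Marshal.PtrToStructure(buffer, typeof(libvlc_log_message_t));
        }
        finally
        {
            Marshal.FreeHGlobal(buffer); // always release the unmanaged buffer
        }
    }
}
```

Because the layout is sequential on both sides, every field survives the trip through native memory unchanged.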
Unions
Unions are similar to structures, but the members declared in the type definition all begin at the same memory location. This means that the layout must be controlled explicitly by the marshaller, which is achieved using the FieldOffset attribute.
Here is the libvlc_event_t definition from libvlc_events.h:

```c
typedef struct libvlc_event_t
{
    int type;
    void *p_obj;
    union
    {
        /* media descriptor */
        struct
        {
            libvlc_meta_t meta_type;
        } media_meta_changed;
        struct
        {
            libvlc_media_t * new_child;
        } media_subitem_added;
        struct
        {
            int64_t new_duration;
        } media_duration_changed;
        /* … */
    }
}
```
It is basically a structure which has two simple members and a union. LayoutKind.Explicit is used to tell the runtime the exact location in memory of each field:

```csharp
[StructLayout(LayoutKind.Explicit)]
public struct libvlc_event_t
{
    [FieldOffset(0)] public libvlc_event_e type;
    [FieldOffset(4)] public IntPtr p_obj;
    [FieldOffset(8)] public media_player_time_changed media_player_time_changed;
}

[StructLayout(LayoutKind.Sequential)]
public struct media_player_time_changed
{
    public long new_time;
}
```

If you intend to extend the libvlc_event_t definition with additional union members, they must all be decorated with the [FieldOffset(8)] attribute, since all of them begin at an offset of 8 bytes.
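The overlap can be demonstrated without libvlc at all. In the hypothetical SampleUnion below, two payload fields share offset 8, so writing one is visible through the other (note, as an aside, that hard-coding p_obj at offset 4 assumes a 32-bit process, where IntPtr is 4 bytes):

```csharp
using System;
using System.Runtime.InteropServices;

// A tiny union analogous to libvlc_event_t: both payload fields are
// declared at the same offset, so they occupy the same memory.
[StructLayout(LayoutKind.Explicit)]
public struct SampleUnion
{
    [FieldOffset(0)] public int type;
    [FieldOffset(8)] public long new_time;     // media_player_time_changed payload
    [FieldOffset(8)] public long new_duration; // media_duration_changed payload
}
```

Assigning new_time and then reading new_duration returns the same value, exactly as reading a different union member would in C.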
Callback functions
When the underlying VLC engine has its internal state changed, it uses callback functions to notify whoever subscribed to this kind of change. Subscriptions are made using the libvlc_event_attach API defined in libvlc.h. The API has four parameters:

- Pointer to the event manager object.
- libvlc_event_type_t enum value specifying the event on which callbacks are required.
- Pointer to the libvlc_callback_t function.
- Optional: additional user data.
The callback function pointer is declared in libvlc.h as follows:

```c
typedef void ( *libvlc_callback_t )( const struct libvlc_event_t *, void * );
```

It accepts a pointer to the libvlc_event_t structure and optional user-defined data.
The managed port is a delegate with the same signature:

```csharp
[UnmanagedFunctionPointer(CallingConvention.Cdecl)]
private delegate void VlcEventHandlerDelegate(ref libvlc_event_t libvlc_event, IntPtr userData);
```
Please note that here I take a reference to the libvlc_event_t structure so I can access its members in the MediaPlayerEventOccured function, unlike other places where I simply use an IntPtr to pass the pointer among method calls.

```csharp
public EventBroker(IntPtr hMediaPlayer)
{
    VlcEventHandlerDelegate callback1 = MediaPlayerEventOccured;
    m_hEventMngr = LibVlcMethods.libvlc_media_player_event_manager(hMediaPlayer);
    hCallback1 = Marshal.GetFunctionPointerForDelegate(callback1);
    m_callbacks.Add(callback1);
    GC.KeepAlive(callback1);
}

private void MediaPlayerEventOccured(ref libvlc_event_t libvlc_event, IntPtr userData)
{
    switch (libvlc_event.type)
    {
        case libvlc_event_e.libvlc_MediaPlayerTimeChanged:
            RaiseTimeChanged(libvlc_event.media_player_time_changed.new_time);
            break;

        case libvlc_event_e.libvlc_MediaPlayerEndReached:
            RaiseMediaEnded();
            break;
    }
}
```
.NET delegate types are managed counterparts of C callback functions; therefore, the System.Runtime.InteropServices.Marshal class contains routines to convert delegates to and from native function pointers. After the delegate is marshaled to a native function pointer callable from native code, we have to maintain a reference to the managed delegate to prevent it from being deallocated by the GC, since native pointers cannot "hold" a reference to a managed resource.
nVLC API
- IMediaPlayerFactory – Wraps the libvlc_instance_t handle and is used to create media objects and media player objects.
- IPlayer – Holds a libvlc_media_player_t handle and is used for basic playout when no audio or video output is needed, for example, streaming or transcoding of media.
- IAudioPlayer – Extends IPlayer and is used to play and/or stream audio media.
- IVideoPlayer – Extends IAudioPlayer and is used to render and/or stream audio and video media.
- IEventBroker – Encapsulates events raised by the VLC engine by wrapping the libvlc_event_manager_t handle.
- IMedia – Wraps the libvlc_media_t handle and lets the user add media options.

The implementation of these interfaces is described in the sections below.
Memory management
Since each wrapper object holds a reference to native memory, we have to make sure this memory is released when the managed object is reclaimed by the garbage collector. This is done either explicitly, by user code calling the Dispose method, or implicitly, by the finalizer when the object is collected. I wrapped this functionality in the DisposableBase class:

```csharp
public abstract class DisposableBase : IDisposable
{
    private bool m_isDisposed;

    public void Dispose()
    {
        if (!m_isDisposed)
        {
            Dispose(true);
            GC.SuppressFinalize(this);
            m_isDisposed = true;
        }
    }

    // if (disposing)
    // {
    //    // get rid of managed resources
    // }
    //
    // get rid of unmanaged resources
    protected abstract void Dispose(bool disposing);

    ~DisposableBase()
    {
        if (!m_isDisposed)
        {
            Dispose(false);
            m_isDisposed = true;
        }
    }

    protected void VerifyObjectNotDisposed()
    {
        if (m_isDisposed)
        {
            throw new ObjectDisposedException(this.GetType().Name);
        }
    }
}
```
Each class that inherits from DisposableBase must implement the Dispose(bool) method, which is called with the parameter true when invoked by user code (both managed and unmanaged resources may be released here), or with the parameter false, which means it is invoked by the finalizer and only native resources may be released.
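As an illustration of this contract, here is a minimal hypothetical subclass (FakeHandleOwner); a condensed copy of DisposableBase is repeated so the snippet compiles on its own:

```csharp
using System;

// Condensed copy of the DisposableBase pattern shown above,
// included only so this snippet is self-contained.
public abstract class DisposableBase : IDisposable
{
    private bool m_isDisposed;

    public void Dispose()
    {
        if (!m_isDisposed)
        {
            Dispose(true);
            GC.SuppressFinalize(this);
            m_isDisposed = true;
        }
    }

    protected abstract void Dispose(bool disposing);

    ~DisposableBase()
    {
        if (!m_isDisposed)
        {
            Dispose(false);
            m_isDisposed = true;
        }
    }
}

// Hypothetical subclass: flags stand in for real managed/native cleanup.
public class FakeHandleOwner : DisposableBase
{
    public bool NativeReleased { get; private set; }
    public bool ManagedReleased { get; private set; }

    protected override void Dispose(bool disposing)
    {
        if (disposing)
        {
            ManagedReleased = true; // safe to touch other managed objects here
        }
        NativeReleased = true; // native handles are released on both paths
    }
}
```

Calling Dispose() from user code takes the disposing == true path, so both cleanup branches run; the finalizer path would set only NativeReleased.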
Logging
VLC implements logging in the form of a log iterator, so I decided to implement it using the Iterator pattern as well, i.e., using a yield return statement:

```csharp
public IEnumerator<LogMessage> GetEnumerator()
{
    IntPtr i = LibVlcMethods.libvlc_log_get_iterator(m_hLog);
    while (LibVlcMethods.libvlc_log_iterator_has_next(i) != 0)
    {
        libvlc_log_message_t msg = new libvlc_log_message_t();
        msg.sizeof_msg = (uint)Marshal.SizeOf(msg);
        LibVlcMethods.libvlc_log_iterator_next(i, ref msg);
        yield return GetMessage(msg);
    }

    LibVlcMethods.libvlc_log_iterator_free(i);
    LibVlcMethods.libvlc_log_clear(m_hLog);
}

private LogMessage GetMessage(libvlc_log_message_t msg)
{
    StringBuilder sb = new StringBuilder();
    sb.AppendFormat("{0} ", Marshal.PtrToStringAnsi(msg.psz_header));
    sb.AppendFormat("{0} ", Marshal.PtrToStringAnsi(msg.psz_message));
    sb.AppendFormat("{0} ", Marshal.PtrToStringAnsi(msg.psz_name));
    sb.Append(Marshal.PtrToStringAnsi(msg.psz_type));

    return new LogMessage()
    {
        Message = sb.ToString(),
        Severity = (libvlc_log_messate_t_severity)msg.i_severity
    };
}
```
This code is called on each timeout (default is 1 second), iterates over all existing log messages, and cleans up the log. The actual writing to the log file (or any other target) is implemented using NLog, and you should add a custom configuration section to your app.config for this to work:

```xml
<configSections>
  <section name="nlog" type="NLog.Config.ConfigSectionHandler, NLog" />
</configSections>
<nlog xmlns="http://www.nlog-project.org/schemas/NLog.xsd"
      xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance">
  <targets>
    <target name="file" xsi:type="File"
            layout="${longdate} ${level} ${message}"
            fileName="${basedir}/logs/logfile.txt"
            keepFileOpen="false"
            encoding="iso-8859-2" />
  </targets>
  <rules>
    <logger name="*" minlevel="Debug" writeTo="file" />
  </rules>
</nlog>
```
Using the code
Before running any application using nVLC, you have to download the latest VLC 1.1.x, or a higher version, from here. After running the installer, go to C:\Program Files\VideoLAN\VLC and copy the following items to your executable path:

- libvlc.dll
- libvlccore.dll
- plugins directory

If any of these is missing at runtime, a DllNotFoundException will be thrown.
In your code, add a reference to the Declarations and Implementation projects. The first instance you have to construct is the MediaPlayerFactory, from which you can create media objects by calling the CreateMedia function and media player objects by calling the CreatePlayer function.
Playback files
The most basic usage is playing back a file onto a specified panel:

```csharp
IMediaPlayerFactory factory = new MediaPlayerFactory();
IMedia media = factory.CreateMedia(@"C:\Videos\Movie.wmv");
IVideoPlayer player = factory.CreatePlayer<IVideoPlayer>();

player.WindowHandle = panel1.Handle;
player.Open(media);
player.Events.MediaEnded += Events_MediaEnded;
player.Events.TimeChanged += Events_TimeChanged;
player.Play();
```
Playback DirectShow
VLC has built-in support for DirectShow capture source filters; that means that if you have a web cam or video acquisition card that has a DirectShow filter, it can be used seamlessly by using the libvlc API.
```csharp
IMedia media = factory.CreateMedia(@"dshow://", @"dshow-vdev=Trust Webcam 15007");
```
Note that the media path is always set to dshow:// and the actual video device is specified by the option parameter.
Playback network stream
VLC supports a wide range of network protocols like UDP, RTP, HTTP, and others. By specifying a media path with a protocol name, IP address, and port, you can capture the stream and render it the same way as opening a local media file:
```csharp
IMedia media = factory.CreateMedia(@"udp://@172.16.10.1:19005");
```
Streaming
Beyond impressive playback capabilities, VLC also acts as a no less impressive streaming engine. Before we jump into the implementation details, I will shortly describe the streaming capabilities of the VLC Media Player.
After running VLC, go to Media -> Streaming. The "Open Media" dialog opens, where you specify the media you want to broadcast over the network: a local file, disk, network stream, or capture device. In this case, I chose a local file, pressed "Stream", and on the next tab, "Next":
Now you can choose the destination of the previously selected stream. If the "File" option is selected and "Activate Transcoding" is checked, you are simply transcoding (or remultiplexing) the media to a different format. For the sake of simplicity, I chose UDP, pressed "Add", and then specified 127.0.0.1:9000, which means I want to stream the media locally on my machine to port 9000.
Make sure "Activate Transcoding" is checked, and press the "Edit Profile" button:
This dialog lets you choose the encapsulation, which is a media container format, a video codec, and an audio codec. The number of possibilities here is huge, and note that not every video and audio format is compatible with each container, but again, for the sake of simplicity, I chose to use the MP4 container with an h264 video encoder and an AAC audio encoder. After pressing "Next", you will have the final dialog with the "Generated stream output string".
This is the most important part, as this string should be passed to the media object; you can simply copy it and use it in the API as follows:

```csharp
string output = @":sout=#transcode{vcodec=h264,vb=0,scale=0,acodec=mp4a," +
                @"ab=128,channels=2,samplerate=44100}:udp{dst=127.0.0.1:9000}";
IMedia media = factory.CreateMedia(@"C:\Videos\Movie.wmv", output);
IPlayer player = factory.CreatePlayer<IPlayer>();
player.Open(media);
player.Play();
```
This will open the selected movie file, transcode it to the desired format, and stream it over UDP.
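Since the sout chain is just a formatted string, a small helper can assemble it from its parts. SoutBuilder below is a hypothetical sketch, not part of nVLC, that builds the same UDP transcode string used above:

```csharp
using System;

// Hypothetical helper that assembles a ":sout=#transcode{...}:udp{...}" option
// string like the one generated by the VLC streaming wizard.
public static class SoutBuilder
{
    public static string BuildUdpTranscode(string vcodec, string acodec,
                                           int audioBitrate, string destination)
    {
        // Doubled braces produce literal { } in the formatted string.
        return string.Format(
            ":sout=#transcode{{vcodec={0},vb=0,scale=0,acodec={1},ab={2}," +
            "channels=2,samplerate=44100}}:udp{{dst={3}}}",
            vcodec, acodec, audioBitrate, destination);
    }
}
```

For example, BuildUdpTranscode("h264", "mp4a", 128, "127.0.0.1:9000") reproduces the option string from the snippet above, ready to pass to CreateMedia.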
Memory renderer
Normally, you would render video on screen by passing a window handle on which the actual frames are displayed according to the media clock. libVLC also allows you to render raw video (pixel data) to a pre-allocated memory buffer. This functionality is implemented by the libvlc_video_set_callbacks and libvlc_video_set_format APIs. IVideoPlayer has a property called CustomRenderer of type IMemoryRenderer which wraps these two APIs:

```csharp
/// <summary>
/// Enables custom processing of video frames.
/// </summary>
public interface IMemoryRenderer
{
    /// <summary>
    /// Sets the callback which is invoked when a new frame should be displayed.
    /// </summary>
    /// <param name="callback">Callback method</param>
    /// <remarks>The frame will be auto-disposed after callback invocation.</remarks>
    void SetCallback(NewFrameEventHandler callback);

    /// <summary>
    /// Gets the latest video frame that was displayed.
    /// </summary>
    Bitmap CurrentFrame { get; }

    /// <summary>
    /// Sets the bitmap format for the callback.
    /// </summary>
    /// <param name="format">Bitmap format of the video frame</param>
    void SetFormat(BitmapFormat format);

    /// <summary>
    /// Gets the actual frame rate of the rendering.
    /// </summary>
    int ActualFrameRate { get; }
}
```
You have two options for frame processing:

- Callback: By calling the SetCallback method, your callback will be invoked whenever a new frame is ready to be displayed. The System.Drawing.Bitmap object passed to the callback method is valid only inside the callback; afterwards it is disposed, so you have to clone it if you plan to use it elsewhere. Also note that the callback code must be extremely efficient; otherwise, the playback will be delayed and frames may be dropped. For instance, if you are rendering a 30 frames per second video, you have a time slot of approximately 33 ms between frames. You can test for performance degradation by comparing the values of IVideoPlayer.FPS and IMemoryRenderer.ActualFrameRate. The following code snippet demonstrates rendering of 4CIF frames in RGB24 format:

```csharp
IMediaPlayerFactory factory = new MediaPlayerFactory();
IVideoPlayer player = factory.CreatePlayer<IVideoPlayer>();
IMedia media = factory.CreateMedia(@"C:\MyVideoFile.avi");

IMemoryRenderer memRender = player.CustomRenderer;
memRender.SetCallback(delegate(Bitmap frame)
{
    // Do something with the bitmap
});
memRender.SetFormat(new BitmapFormat(704, 576, ChromaType.RV24));

player.Open(media);
player.Play();
```

- Get frame: If you want to query for frames at your own pace, you should use the CurrentFrame property. It returns the latest frame that was scheduled for display. It is your responsibility to free its resources when you are done with it.

```csharp
IMediaPlayerFactory factory = new MediaPlayerFactory();
IVideoPlayer player = factory.CreatePlayer<IVideoPlayer>();
IMedia media = factory.CreateMedia(@"C:\MyVideoFile.avi");

IMemoryRenderer memRender = player.CustomRenderer;
memRender.SetFormat(new BitmapFormat(704, 576, ChromaType.RV24));

player.Open(media);
player.Play();

private void OnTimer(IMemoryRenderer memRender)
{
    Bitmap bmp = memRender.CurrentFrame;
    // Do something with the bitmap
    bmp.Dispose();
}
```

The SetFormat method accepts a BitmapFormat object which encapsulates the frame size and pixel format. Bytes per pixel, size of the frame, and pitch (or stride) are calculated internally according to the ChromaType value.

The IVideoPlayer may operate either in on-screen rendering mode or in memory rendering mode. Once you set it to memory rendering mode by accessing the CustomRenderer property, you will not see any video on screen.
Advanced memory renderer
Starting with libVLC 1.2.0, it is possible to use the VLC engine to output decoded audio and video data for custom processing, i.e., to take any kind of encoded and multiplexed media as input and output decoded video frames and audio samples. The format of the audio and video samples can be set before playback starts, as well as video size, pixel alignment, audio format, number of channels, and more. When playback starts, the appropriate callback function is invoked for each video frame at its display time, and for a given number of audio samples at their playback time. This gives you, as a developer, great flexibility, since you can apply different image and sound processing algorithms and, if needed, eventually render the audiovisual data.
libVLC exposes this advanced functionality through the libvlc_video_set_*** and libvlc_audio_set_*** sets of APIs. In the nVLC project, video functionality is exposed through the IMemoryRendererEx interface:

```csharp
/// <summary>
/// Contains methods for setting custom processing of video frames.
/// </summary>
public interface IMemoryRendererEx
{
    /// <summary>
    /// Sets the callback which is invoked when a new frame should be displayed.
    /// </summary>
    /// <param name="callback">Callback method</param>
    void SetCallback(NewFrameDataEventHandler callback);

    /// <summary>
    /// Gets the latest video frame that was displayed.
    /// </summary>
    PlanarFrame CurrentFrame { get; }

    /// <summary>
    /// Sets the callback invoked before the media playback starts
    /// to set the desired frame format.
    /// </summary>
    /// <param name="setupCallback"></param>
    /// <remarks>If not set, the original media format will be used.</remarks>
    void SetFormatSetupCallback(Func<BitmapFormat, BitmapFormat> setupCallback);

    /// <summary>
    /// Gets the actual frame rate of the rendering.
    /// </summary>
    int ActualFrameRate { get; }
}
```
and audio samples can be accessed through the CustomAudioRenderer property of the IAudioPlayer object:

```csharp
/// <summary>
/// Enables custom processing of audio samples.
/// </summary>
public interface IAudioRenderer
{
    /// <summary>
    /// Sets callback methods for volume change and audio samples playback.
    /// </summary>
    /// <param name="volume">Callback method invoked when volume changed or muted</param>
    /// <param name="sound">Callback method invoked when new audio samples should be played</param>
    void SetCallbacks(VolumeChangedEventHandler volume, NewSoundEventHandler sound);

    /// <summary>
    /// Sets the audio format.
    /// </summary>
    /// <param name="format"></param>
    /// <remarks>Mutually exclusive with SetFormatCallback</remarks>
    void SetFormat(SoundFormat format);

    /// <summary>
    /// Sets the audio format callback, to get/set the format before playback starts.
    /// </summary>
    /// <param name="formatSetup"></param>
    /// <remarks>Mutually exclusive with SetFormat</remarks>
    void SetFormatCallback(Func<SoundFormat, SoundFormat> formatSetup);
}
```
To make the task of rendering video samples and playing audio samples easier, I developed a small library called Taygeta. It started as a testing application for the nVLC features, but since I liked it so much :) I decided to turn it into a standalone project. It uses Direct3D for hardware-accelerated video rendering and XAudio2 for audio playback. It also contains a sample application with all the previously described functionality.
Memory input
As explained in previous sections, VLC provides many access modules for your media. When none of these satisfies your requirements, for example when you need to capture a window's contents or stream a 3D scene to another machine, memory input will do the work, as it provides an interface for streaming media from a memory buffer. libVLC contains two modules for memory input: invmem and imem. The problem is that neither of them is exposed by the libVLC API, and one has to put some real effort into making them work, especially from managed code.
invmem was deprecated in libVLC 1.2, so I will not describe it here. It is exposed via the IVideoInputMedia object, and you can search the "Comments and Discussions" forum for usage examples.

imem, on the other hand, is still supported and is exposed by the IMemoryInputMedia object:

```csharp
/// <summary>
/// Enables elementary stream (audio, video, subtitles, or data) frame insertion
/// into the VLC engine (based on the imem access module).
/// </summary>
public interface IMemoryInputMedia : IMedia
{
    /// <summary>
    /// Initializes the media object with stream information and the frame queue size.
    /// </summary>
    /// <param name="streamInfo"></param>
    /// <param name="maxItemsInQueue">Maximum items in the queue. If the queue is full,
    /// any AddFrame overload will block until a queue slot becomes available.</param>
    void Initialize(StreamInfo streamInfo, int maxItemsInQueue = 30);

    /// <summary>
    /// Adds a frame of elementary stream data from memory on the native heap.
    /// </summary>
    /// <param name="frame"></param>
    /// <remarks>This function copies the frame data to an internal buffer,
    /// so the native memory may be safely freed.</remarks>
    void AddFrame(FrameData frame);

    /// <summary>
    /// Adds a frame of elementary stream data from memory on the managed heap.
    /// </summary>
    /// <param name="data"></param>
    /// <param name="pts">Presentation time stamp</param>
    /// <param name="dts">Decoding time stamp, -1 for unknown</param>
    /// <remarks>The time origin for both pts and dts is 0.</remarks>
    void AddFrame(byte[] data, long pts, long dts = -1);

    /// <summary>
    /// Adds a frame of video stream from a System.Drawing.Bitmap object.
    /// </summary>
    /// <param name="bitmap"></param>
    /// <param name="pts">Presentation time stamp</param>
    /// <param name="dts">Decoding time stamp, -1 for unknown</param>
    /// <remarks>The time origin for both pts and dts is 0.
    /// This function copies the bitmap data to an internal buffer,
    /// so the bitmap may be safely disposed.</remarks>
    void AddFrame(Bitmap bitmap, long pts, long dts = -1);

    /// <summary>
    /// Sets the handler for exceptions thrown by background threads.
    /// </summary>
    /// <param name="handler"></param>
    void SetExceptionHandler(Action<Exception> handler);

    /// <summary>
    /// Gets the number of pending frames in the queue.
    /// </summary>
    int PendingFramesCount { get; }
}
```
The interface provides three AddFrame overloads which take frame data from a pointer on the native heap, a managed byte array, or a Bitmap object. Each method copies the data to an internal structure and stores it in the frame queue; therefore, after calling AddFrame you can release the frame resources. Once you initialize the IMemoryInputMedia object and call play on the media player object, VLC launches a playback thread which runs an infinite loop. Inside the loop it fetches frames of data and pushes them as quickly as possible to the downstream modules.
To support this paradigm, I created a producer/consumer queue to hold media frames. The queue is a BlockingCollection, which perfectly suits the needs of this module: it blocks the producer thread if the queue is full and blocks the consumer thread when the queue is empty. The default queue size is 30, so it caches approximately 1 second of video, which allows smooth playback. Take into account that increasing the queue size will impact your memory usage: one frame of HD video (1920x1080) at BGR24 occupies 5.93 MB. If you have frame rate control over your media source, you can periodically check the number of pending frames in the queue and increase or decrease the rate.
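The bounded producer/consumer behavior described above can be sketched with a plain BlockingCollection. FrameQueueDemo below is a hypothetical stand-in for nVLC's internal frame queue, not the actual implementation:

```csharp
using System;
using System.Collections.Concurrent;
using System.Threading.Tasks;

public static class FrameQueueDemo
{
    // A bounded queue: Add blocks while the queue holds `capacity` items,
    // and GetConsumingEnumerable blocks while it is empty.
    public static int ProduceAndConsume(int frames, int capacity)
    {
        var queue = new BlockingCollection<byte[]>(capacity);

        var producer = Task.Run(() =>
        {
            for (int i = 0; i < frames; i++)
                queue.Add(new byte[16]); // stands in for AddFrame; blocks when full
            queue.CompleteAdding();      // signal the consumer that no more frames come
        });

        // Stands in for the VLC playback thread draining the queue.
        int consumed = 0;
        foreach (var frame in queue.GetConsumingEnumerable())
            consumed++;

        producer.Wait();
        return consumed;
    }
}
```

With a capacity of 30 and more frames than that, the producer is throttled by the consumer, which is exactly the back-pressure AddFrame relies on.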
The DTS and PTS values are used to notify the libVLC engine when the frame should be handled by the decoder (decoding time stamp) and when the frame should be presented by the renderer (presentation time stamp). The default value for DTS is -1, which means it is not used and only the PTS matters. This is useful with raw video frames like BGR24 or I420, which go directly to rendering, so no decoding is needed. The PTS is mandatory, and if you don't have it along with your media frames, it can easily be calculated using the FPS of your media source and a frame counter:

```csharp
long frameNumber = 0;
long frameIntervalInMicroSeconds = 1000000 / FrameRate;
long PTS = ++frameNumber * frameIntervalInMicroSeconds;
```

This gives the PTS value, in microseconds, of the rendered frame.
Using the code is the same as with any other media instance:

```csharp
StreamInfo fInfo = new StreamInfo();
fInfo.Category = StreamCategory.Video;
fInfo.Codec = VideoCodecs.BGR24;
fInfo.FPS = FrameRate;
fInfo.Width = Width;
fInfo.Height = Height;

IMemoryInputMedia m_iMem = m_factory.CreateMedia<IMemoryInputMedia>(MediaStrings.IMEM);
m_iMem.Initialize(fInfo);

m_player.Open(m_iMem);
m_player.Play();

...

private void OnYourMediaSourceCallback(MediaFrame frame)
{
    var fdata = new FrameData()
    {
        Data = frame.Data,
        DataSize = frame.DataSize,
        DTS = -1,
        PTS = frame.PTS
    };
    m_iMem.AddFrame(fdata);
    frame.Dispose();
}
```
Don't forget to dispose of the media object when you are done with it, as doing so also releases the memory of all pending frames.
References
- http://www.videolan.org/doc/streaming-howto/en/
- http://www.pinvoke.net
History
- 14.9.2010
  - Initial release.
- 27.9.2010
  - Fixed the MediaEnded event to be invoked on a ThreadPool thread (issue reported by debesta).
  - Implemented the missing Volume and Mute properties in the IAudioPlayer object.
  - Added implementation for MediaList and MediaListPlayer functionality.
- 22.10.2010
  - Added Unicode support.
  - Fixed the TakeSnapShot method (issue reported by Member 7477754).
  - Extended audio and video player functionality.
  - Added IDiskPlayer for DVD, VCD, and Audio CD playback.
  - Added IMemoryRenderer for custom video rendering (libvlc_video_set_callbacks and libvlc_video_set_format).
  - Added video filters (crop, deinterlace, and adjust) and overlay filters (logo and marquee).
  - Added CHM documentation.
- 18.11.2010
  - Added IVideoInputMedia for using the invmem access module for frame-by-frame video input.
  - Added IScreenCaptureMedia for capturing the entire screen or part of it.
  - Fixed the libvlc_media_get_tracks_info implementation (issues reported by Member 2090855).
  - Extended async events functionality for IMedia, IPlayer, IMediaList, and IMediaListPlayer objects.
  - Extended IMedia functionality (some members moved to the IMediaFromFile interface).
  - Added sample applications for Windows Forms and WPF.
- 19.4.2011
  - Fixed the WPF sample and the deinterlace filter.
  - Added DVD navigation API (libvlc 1.2.0 or above).
  - Solution upgraded to VS 2010 and .NET Framework 4.0.
- 6.7.2011
  - Changed P/Invoke signatures to prevent the PInvokeStackImbalance exception (thanks to PABnet).
  - Added VLM (VideoLAN Manager) implementation (thanks to Mulltonne).
  - Added support for libvlc 1.2.0, including filter enumeration and IMemoryRendererEx with YUV420 and YUV422 support.
  - Added a memory audio renderer.
- 25.10.2011
  - Bug fixes and minor changes.
- 10.10.2012
  - Added audio output module and audio output device selection.
  - Added auto discovery of libVLC DLLs (thanks to Raben).
  - Added a WPF sample based on D3DImage for better WPF integration.
  - Added support for the x64 platform, libvlc 2.0.1 or higher (thanks to John O'Halloran).
  - Added imem module support for video/audio memory input.
  - Added support for the J420 (MJPEG) chroma type within IMemoryRendererEx.
  - Updated license for libVLC 2.0.0 to LGPLv2.