Windows Media Foundation Audio/Video Capture
Reposted from: https://www.cnblogs.com/rhzhang/p/5185686.html
Preface
I'm a civilized person... but sometimes I just can't help venting a little:
1. I'm really behind the times, still doing Windows application development at this point... ahem;
2. DirectShow is a pain! All I want is the raw camera data, yet it makes me learn all sorts of .ax files, filters, graphs... and the material on it is scarce and obscure;
3. Here's hoping Windows XP and everything before it leaves the stage soon, so that DirectShow is no longer necessary!
Audio/Video Capture
Enumerating and configuring devices
Reading media through the Source Reader
Related documentation:
https://msdn.microsoft.com/en-us/library/dd743690.aspx
https://msdn.microsoft.com/en-us/library/dd317912.aspx
https://msdn.microsoft.com/en-us/library/dd940326.aspx
https://msdn.microsoft.com/en-us/library/dd940328.aspx (handling a device that is lost mid-use; still to study)
https://msdn.microsoft.com/en-us/library/ee663602.aspx
https://msdn.microsoft.com/en-us/library/aa473818.aspx (media types)
Other features
Audio/video capture + encoding (Transcode): https://msdn.microsoft.com/en-us/library/ff485863.aspx
Playing media files: https://msdn.microsoft.com/en-us/library/ms703190.aspx
Sample code location
In the Samples\multimedia\mediafoundation directory of the Windows SDK
Enumerating Video Capture Devices
This topic describes how to enumerate the video capture devices on the user's system, and how to create an instance of a device.
To enumerate the video capture devices on the system, do the following:
- Call MFCreateAttributes to create an attribute store. This function receives an IMFAttributes pointer.
- Call IMFAttributes::SetGUID to set the MF_DEVSOURCE_ATTRIBUTE_SOURCE_TYPE attribute. Set the attribute value to MF_DEVSOURCE_ATTRIBUTE_SOURCE_TYPE_VIDCAP_GUID.
- Call MFEnumDeviceSources. This function receives an array of IMFActivate pointers and the array size. Each pointer represents a distinct video capture device.
To create an instance of a capture device:
- Call IMFActivate::ActivateObject to get a pointer to the IMFMediaSource interface.
The following code shows these steps:
HRESULT CreateVideoDeviceSource(IMFMediaSource **ppSource)
{
    *ppSource = NULL;

    IMFMediaSource *pSource = NULL;
    IMFAttributes *pAttributes = NULL;
    IMFActivate **ppDevices = NULL;
    UINT32 count = 0;   // initialized here so the cleanup loop is safe after an early goto

    // Create an attribute store to specify the enumeration parameters.
    HRESULT hr = MFCreateAttributes(&pAttributes, 1);
    if (FAILED(hr))
    {
        goto done;
    }

    // Source type: video capture devices
    hr = pAttributes->SetGUID(
        MF_DEVSOURCE_ATTRIBUTE_SOURCE_TYPE,
        MF_DEVSOURCE_ATTRIBUTE_SOURCE_TYPE_VIDCAP_GUID
        );
    if (FAILED(hr))
    {
        goto done;
    }

    // Enumerate devices.
    hr = MFEnumDeviceSources(pAttributes, &ppDevices, &count);
    if (FAILED(hr))
    {
        goto done;
    }

    if (count == 0)
    {
        hr = E_FAIL;
        goto done;
    }

    // Create the media source object.
    hr = ppDevices[0]->ActivateObject(IID_PPV_ARGS(&pSource));
    if (FAILED(hr))
    {
        goto done;
    }

    *ppSource = pSource;
    (*ppSource)->AddRef();

done:
    SafeRelease(&pAttributes);
    for (DWORD i = 0; i < count; i++)
    {
        SafeRelease(&ppDevices[i]);
    }
    CoTaskMemFree(ppDevices);
    SafeRelease(&pSource);
    return hr;
}
After you create the media source, release the interface pointers and free the memory for the array:
SafeRelease(&pAttributes);
for (DWORD i = 0; i < count; i++)
{
SafeRelease(&ppDevices[i]);
}
CoTaskMemFree(ppDevices);
Audio/Video Capture in Media Foundation
Microsoft Media Foundation supports audio and video capture. Video capture devices are supported through the UVC class driver and must be compatible with UVC 1.1. Audio capture devices are supported through Windows Audio Session API (WASAPI).
A capture device is represented in Media Foundation by a media source object, which exposes the IMFMediaSource interface. In most cases, the application will not use this interface directly, but will use a higher-level API such as the Source Reader to control the capture device.
Enumerate Capture Devices
To enumerate the capture devices on the system, perform the following steps:
- Call the MFCreateAttributes function to create an attribute store.
- Set the MF_DEVSOURCE_ATTRIBUTE_SOURCE_TYPE attribute to one of the following values:
  - MF_DEVSOURCE_ATTRIBUTE_SOURCE_TYPE_AUDCAP_GUID: Enumerate audio capture devices.
  - MF_DEVSOURCE_ATTRIBUTE_SOURCE_TYPE_VIDCAP_GUID: Enumerate video capture devices.
- Call the MFEnumDeviceSources function. This function allocates an array of IMFActivate pointers. Each pointer represents an activation object for one device on the system.
- Call the IMFActivate::ActivateObject method to create an instance of the media source from one of the activation objects.
The following example creates a media source for the first video capture device in the enumeration list:
HRESULT CreateVideoCaptureDevice(IMFMediaSource **ppSource)
{
    *ppSource = NULL;

    UINT32 count = 0;

    IMFAttributes *pConfig = NULL;
    IMFActivate **ppDevices = NULL;

    // Create an attribute store to hold the search criteria.
    HRESULT hr = MFCreateAttributes(&pConfig, 1);

    // Request video capture devices.
    if (SUCCEEDED(hr))
    {
        hr = pConfig->SetGUID(
            MF_DEVSOURCE_ATTRIBUTE_SOURCE_TYPE,
            MF_DEVSOURCE_ATTRIBUTE_SOURCE_TYPE_VIDCAP_GUID
            );
    }

    // Enumerate the devices.
    if (SUCCEEDED(hr))
    {
        hr = MFEnumDeviceSources(pConfig, &ppDevices, &count);
    }

    // Create a media source for the first device in the list.
    if (SUCCEEDED(hr))
    {
        if (count > 0)
        {
            hr = ppDevices[0]->ActivateObject(IID_PPV_ARGS(ppSource));
        }
        else
        {
            hr = MF_E_NOT_FOUND;
        }
    }

    for (DWORD i = 0; i < count; i++)
    {
        ppDevices[i]->Release();
    }
    CoTaskMemFree(ppDevices);
    SafeRelease(&pConfig);   // release the attribute store as well
    return hr;
}
You can query the activation objects for various attributes, including the following:
- The MF_DEVSOURCE_ATTRIBUTE_FRIENDLY_NAME attribute contains the display name of the device. The display name is suitable for showing to the user, but might not be unique.
- For video devices, the MF_DEVSOURCE_ATTRIBUTE_SOURCE_TYPE_VIDCAP_SYMBOLIC_LINK attribute contains the symbolic link to the device. The symbolic link uniquely identifies the device on the system, but is not a readable string.
- For audio devices, the MF_DEVSOURCE_ATTRIBUTE_SOURCE_TYPE_AUDCAP_ENDPOINT_ID attribute contains the audio endpoint ID of the device. The audio endpoint ID is similar to a symbolic link. It uniquely identifies the device on the system, but is not a readable string.
The following example takes an array of IMFActivate pointers and prints the display name of each device to the debug window:
void DebugShowDeviceNames(IMFActivate **ppDevices, UINT count)
{
    for (DWORD i = 0; i < count; i++)
    {
        HRESULT hr = S_OK;
        WCHAR *szFriendlyName = NULL;

        // Try to get the display name.
        UINT32 cchName;
        hr = ppDevices[i]->GetAllocatedString(
            MF_DEVSOURCE_ATTRIBUTE_FRIENDLY_NAME,
            &szFriendlyName, &cchName);

        if (SUCCEEDED(hr))
        {
            OutputDebugString(szFriendlyName);
            OutputDebugString(L"\n");
        }
        CoTaskMemFree(szFriendlyName);
    }
}
If you already know the symbolic link for a video device, there is another way to create the media source for the device:
- Call MFCreateAttributes to create an attribute store.
- Set the MF_DEVSOURCE_ATTRIBUTE_SOURCE_TYPE attribute to MF_DEVSOURCE_ATTRIBUTE_SOURCE_TYPE_VIDCAP_GUID.
- Set the MF_DEVSOURCE_ATTRIBUTE_SOURCE_TYPE_VIDCAP_SYMBOLIC_LINK attribute to the symbolic link.
- Call either the MFCreateDeviceSource or MFCreateDeviceSourceActivate function. The former returns an IMFMediaSource pointer. The latter returns an IMFActivate pointer to an activation object. You can use the activation object to create the source. (An activation object can be marshaled to another process, so it is useful if you want to create the source in another process. For more information, see Activation Objects.)
The following example takes the symbolic link of a video device and creates a media source.
HRESULT CreateVideoCaptureDevice(PCWSTR pszSymbolicLink, IMFMediaSource **ppSource)
{
    *ppSource = NULL;

    IMFAttributes *pAttributes = NULL;

    HRESULT hr = MFCreateAttributes(&pAttributes, 2);

    // Set the device type to video.
    if (SUCCEEDED(hr))
    {
        hr = pAttributes->SetGUID(
            MF_DEVSOURCE_ATTRIBUTE_SOURCE_TYPE,
            MF_DEVSOURCE_ATTRIBUTE_SOURCE_TYPE_VIDCAP_GUID
            );
    }

    // Set the symbolic link.
    if (SUCCEEDED(hr))
    {
        hr = pAttributes->SetString(
            MF_DEVSOURCE_ATTRIBUTE_SOURCE_TYPE_VIDCAP_SYMBOLIC_LINK,
            pszSymbolicLink
            );
    }

    if (SUCCEEDED(hr))
    {
        hr = MFCreateDeviceSource(pAttributes, ppSource);
    }

    SafeRelease(&pAttributes);
    return hr;
}
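For the MFCreateDeviceSourceActivate variant mentioned in the steps above, a minimal sketch might look like this (not from the original article; error handling abbreviated, and the function name is my own):

```cpp
// Sketch: build an IMFActivate for a video device from its symbolic link.
// The activation object can later be activated (possibly in another
// process) to obtain the actual IMFMediaSource.
HRESULT CreateVideoSourceActivate(PCWSTR pszSymbolicLink, IMFActivate **ppActivate)
{
    IMFAttributes *pAttributes = NULL;

    HRESULT hr = MFCreateAttributes(&pAttributes, 2);

    if (SUCCEEDED(hr))
    {
        hr = pAttributes->SetGUID(
            MF_DEVSOURCE_ATTRIBUTE_SOURCE_TYPE,
            MF_DEVSOURCE_ATTRIBUTE_SOURCE_TYPE_VIDCAP_GUID);
    }
    if (SUCCEEDED(hr))
    {
        hr = pAttributes->SetString(
            MF_DEVSOURCE_ATTRIBUTE_SOURCE_TYPE_VIDCAP_SYMBOLIC_LINK,
            pszSymbolicLink);
    }
    if (SUCCEEDED(hr))
    {
        // Returns an activation object instead of creating the source directly.
        hr = MFCreateDeviceSourceActivate(pAttributes, ppActivate);
    }

    SafeRelease(&pAttributes);
    return hr;
}
```

When you are ready to use the device, call IMFActivate::ActivateObject with IID_PPV_ARGS on the returned pointer, exactly as in the enumeration examples earlier.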
There is an equivalent way to create an audio device from the audio endpoint ID:
- Call MFCreateAttributes to create an attribute store.
- Set the MF_DEVSOURCE_ATTRIBUTE_SOURCE_TYPE attribute to MF_DEVSOURCE_ATTRIBUTE_SOURCE_TYPE_AUDCAP_GUID.
- Set the MF_DEVSOURCE_ATTRIBUTE_SOURCE_TYPE_AUDCAP_ENDPOINT_ID attribute to the endpoint ID.
- Call either the MFCreateDeviceSource or MFCreateDeviceSourceActivate function.
The following example takes an audio endpoint ID and creates a media source.
HRESULT CreateAudioCaptureDevice(PCWSTR pszEndPointID, IMFMediaSource **ppSource)
{
    *ppSource = NULL;

    IMFAttributes *pAttributes = NULL;

    HRESULT hr = MFCreateAttributes(&pAttributes, 2);

    // Set the device type to audio.
    if (SUCCEEDED(hr))
    {
        hr = pAttributes->SetGUID(
            MF_DEVSOURCE_ATTRIBUTE_SOURCE_TYPE,
            MF_DEVSOURCE_ATTRIBUTE_SOURCE_TYPE_AUDCAP_GUID
            );
    }

    // Set the endpoint ID.
    if (SUCCEEDED(hr))
    {
        hr = pAttributes->SetString(
            MF_DEVSOURCE_ATTRIBUTE_SOURCE_TYPE_AUDCAP_ENDPOINT_ID,
            pszEndPointID
            );
    }

    if (SUCCEEDED(hr))
    {
        hr = MFCreateDeviceSource(pAttributes, ppSource);
    }

    SafeRelease(&pAttributes);
    return hr;
}
Use a capture device
After you create the media source for a capture device, use the Source Reader to get data from the device. The Source Reader delivers media samples that contain the captured audio data or video frames. The next step depends on your application scenario:
- Video preview: Use Microsoft Direct3D or Direct2D to display the video.
- File capture: Use the Sink Writer to encode the file.
- Audio preview: Use WASAPI.
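The Source Reader loop described above can be sketched as follows (a minimal sketch, not from the original article; error handling and media-type negotiation are omitted, and the function name is my own):

```cpp
// Sketch: pull a few video samples synchronously from a capture source.
HRESULT ReadFramesFromSource(IMFMediaSource *pSource)
{
    IMFSourceReader *pReader = NULL;

    // Wrap the media source in a Source Reader.
    HRESULT hr = MFCreateSourceReaderFromMediaSource(pSource, NULL, &pReader);

    for (int i = 0; SUCCEEDED(hr) && i < 10; i++)
    {
        DWORD streamIndex = 0;
        DWORD flags = 0;
        LONGLONG llTimeStamp = 0;
        IMFSample *pSample = NULL;

        // Blocks until the next sample (or an event) is available.
        hr = pReader->ReadSample(
            (DWORD)MF_SOURCE_READER_FIRST_VIDEO_STREAM,
            0, &streamIndex, &flags, &llTimeStamp, &pSample);

        if (pSample)
        {
            // Process the frame here (e.g. lock the buffer, copy pixels).
            pSample->Release();
        }
        if (flags & MF_SOURCE_READER_ENDOFSTREAM)
        {
            break;
        }
    }

    if (pReader)
    {
        pReader->Release();
    }
    return hr;
}
```

ReadSample can return a NULL sample even on success (for example, on a stream tick), which is why the sketch checks the pointer rather than only the HRESULT.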
If you want to combine audio capture with video capture, use the aggregate media source. The aggregate media source contains a collection of media sources and combines all of their streams into a single media source object. To create an instance of the aggregate media source, call the MFCreateAggregateSource function.
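The aggregate-source step can be sketched as follows (a minimal sketch, not from the original article; the function name is my own, and error handling is abbreviated):

```cpp
// Sketch: combine one video and one audio capture source into a single
// aggregate media source whose streams can then be read together.
HRESULT CreateAggregateCaptureSource(
    IMFMediaSource *pVideoSource,
    IMFMediaSource *pAudioSource,
    IMFMediaSource **ppAggSource)
{
    IMFCollection *pCollection = NULL;

    // Build a collection holding both sources.
    HRESULT hr = MFCreateCollection(&pCollection);
    if (SUCCEEDED(hr))
    {
        hr = pCollection->AddElement(pVideoSource);
    }
    if (SUCCEEDED(hr))
    {
        hr = pCollection->AddElement(pAudioSource);
    }

    // The aggregate source exposes all streams from all contained sources.
    if (SUCCEEDED(hr))
    {
        hr = MFCreateAggregateSource(pCollection, ppAggSource);
    }

    SafeRelease(&pCollection);
    return hr;
}
```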
Shut down the capture device
When the capture device is no longer needed, you must shut down the device by calling Shutdown on the IMFMediaSource object you obtained by calling MFCreateDeviceSource or IMFActivate::ActivateObject. Failure to call Shutdown can result in memory leaks because the system may keep a reference to IMFMediaSource resources until Shutdown is called.
If you allocated a string containing the symbolic link to a capture device, you should release this object as well.
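The teardown sequence just described can be sketched like this (a minimal sketch assuming the g_pSource and g_pwszSymbolicLink globals used elsewhere in this article):

```cpp
// Sketch: shut down and release a capture source, and free the symbolic
// link string that was allocated by GetAllocatedString.
void CloseCaptureDevice()
{
    if (g_pSource)
    {
        g_pSource->Shutdown();   // releases system resources held by the source
        g_pSource->Release();
        g_pSource = NULL;
    }

    CoTaskMemFree(g_pwszSymbolicLink);   // allocated with CoTaskMemAlloc
    g_pwszSymbolicLink = NULL;
}
```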
Handling Video Device Loss
This topic describes how to detect device loss when using a video capture device. It contains the following sections:
- Register For Device Notification
- Get the Symbolic Link of the Device
- Handle WM_DEVICECHANGE
- Unregister For Notification
Register For Device Notification
Before you start capturing from the device, call the RegisterDeviceNotification function to register for device notifications. Register for the KSCATEGORY_CAPTURE device class, as shown in the following code.
#include <Dbt.h>
#include <ks.h>
#include <ksmedia.h>

HDEVNOTIFY g_hdevnotify = NULL;

BOOL RegisterForDeviceNotification(HWND hwnd)
{
    DEV_BROADCAST_DEVICEINTERFACE di = { 0 };
    di.dbcc_size = sizeof(di);
    di.dbcc_devicetype = DBT_DEVTYP_DEVICEINTERFACE;
    di.dbcc_classguid = KSCATEGORY_CAPTURE;

    g_hdevnotify = RegisterDeviceNotification(
        hwnd,
        &di,
        DEVICE_NOTIFY_WINDOW_HANDLE
        );

    if (g_hdevnotify == NULL)
    {
        return FALSE;
    }
    return TRUE;
}
Get the Symbolic Link of the Device
Enumerate the video devices on the system, as described in Enumerating Video Capture Devices. Choose a device from the list, and then query the activation object for the MF_DEVSOURCE_ATTRIBUTE_SOURCE_TYPE_VIDCAP_SYMBOLIC_LINK attribute, as shown in the following code.
WCHAR *g_pwszSymbolicLink = NULL;
UINT32 g_cchSymbolicLink = 0;
HRESULT GetSymbolicLink(IMFActivate *pActivate)
{
return pActivate->GetAllocatedString(
MF_DEVSOURCE_ATTRIBUTE_SOURCE_TYPE_VIDCAP_SYMBOLIC_LINK,
&g_pwszSymbolicLink,
&g_cchSymbolicLink
);
}
Handle WM_DEVICECHANGE
In your message loop, listen for WM_DEVICECHANGE messages. The lParam message parameter is a pointer to a DEV_BROADCAST_HDR structure.
case WM_DEVICECHANGE:
    if (lParam != 0)
    {
        HRESULT hr = S_OK;
        BOOL bDeviceLost = FALSE;

        hr = CheckDeviceLost((PDEV_BROADCAST_HDR)lParam, &bDeviceLost);

        if (FAILED(hr) || bDeviceLost)
        {
            CloseDevice();
            MessageBox(hwnd, L"Lost the capture device.", NULL, MB_OK);
        }
    }
    return TRUE;
Next, compare the device notification message against the symbolic link of your device, as follows:
- Check the dbch_devicetype member of the DEV_BROADCAST_HDR structure. If the value is DBT_DEVTYP_DEVICEINTERFACE, cast the structure pointer to a DEV_BROADCAST_DEVICEINTERFACE structure.
- Compare the dbcc_name member of this structure to the symbolic link of the device.
HRESULT CheckDeviceLost(DEV_BROADCAST_HDR *pHdr, BOOL *pbDeviceLost)
{
    DEV_BROADCAST_DEVICEINTERFACE *pDi = NULL;

    if (pbDeviceLost == NULL)
    {
        return E_POINTER;
    }

    *pbDeviceLost = FALSE;

    if (g_pSource == NULL)
    {
        return S_OK;
    }
    if (pHdr == NULL)
    {
        return S_OK;
    }
    if (pHdr->dbch_devicetype != DBT_DEVTYP_DEVICEINTERFACE)
    {
        return S_OK;
    }

    // Compare the device name with the symbolic link.
    pDi = (DEV_BROADCAST_DEVICEINTERFACE*)pHdr;

    if (g_pwszSymbolicLink)
    {
        if (_wcsicmp(g_pwszSymbolicLink, pDi->dbcc_name) == 0)
        {
            *pbDeviceLost = TRUE;
        }
    }
    return S_OK;
}
Unregister For Notification
Before the application exits, call UnregisterDeviceNotification to unregister for device notifications.
void OnClose(HWND /*hwnd*/)
{
    if (g_hdevnotify)
    {
        UnregisterDeviceNotification(g_hdevnotify);
    }
    PostQuitMessage(0);
}
Media Type Debugging Code
You can use the following code to view the contents of a media type while debugging.
// The following code enables you to view the contents of a media type
// while debugging.

#include <strsafe.h>

LPCWSTR GetGUIDNameConst(const GUID& guid);
HRESULT GetGUIDName(const GUID& guid, WCHAR **ppwsz);

HRESULT LogAttributeValueByIndex(IMFAttributes *pAttr, DWORD index);
HRESULT SpecialCaseAttributeValue(GUID guid, const PROPVARIANT& var);

void DBGMSG(PCWSTR format, ...);

HRESULT LogMediaType(IMFMediaType *pType)
{
    UINT32 count = 0;

    HRESULT hr = pType->GetCount(&count);
    if (FAILED(hr))
    {
        return hr;
    }

    if (count == 0)
    {
        DBGMSG(L"Empty media type.\n");
    }

    for (UINT32 i = 0; i < count; i++)
    {
        hr = LogAttributeValueByIndex(pType, i);
        if (FAILED(hr))
        {
            break;
        }
    }
    return hr;
}

HRESULT LogAttributeValueByIndex(IMFAttributes *pAttr, DWORD index)
{
    WCHAR *pGuidName = NULL;
    WCHAR *pGuidValName = NULL;

    GUID guid = { 0 };

    PROPVARIANT var;
    PropVariantInit(&var);

    HRESULT hr = pAttr->GetItemByIndex(index, &guid, &var);
    if (FAILED(hr))
    {
        goto done;
    }

    hr = GetGUIDName(guid, &pGuidName);
    if (FAILED(hr))
    {
        goto done;
    }

    DBGMSG(L"\t%s\t", pGuidName);

    hr = SpecialCaseAttributeValue(guid, var);
    if (FAILED(hr))
    {
        goto done;
    }
    if (hr == S_FALSE)
    {
        switch (var.vt)
        {
        case VT_UI4:
            DBGMSG(L"%d", var.ulVal);
            break;

        case VT_UI8:
            DBGMSG(L"%I64d", var.uhVal);
            break;

        case VT_R8:
            DBGMSG(L"%f", var.dblVal);
            break;

        case VT_CLSID:
            hr = GetGUIDName(*var.puuid, &pGuidValName);
            if (SUCCEEDED(hr))
            {
                DBGMSG(pGuidValName);
            }
            break;

        case VT_LPWSTR:
            DBGMSG(var.pwszVal);
            break;

        case VT_VECTOR | VT_UI1:
            DBGMSG(L"<<byte array>>");
            break;

        case VT_UNKNOWN:
            DBGMSG(L"IUnknown");
            break;

        default:
            DBGMSG(L"Unexpected attribute type (vt = %d)", var.vt);
            break;
        }
    }

done:
    DBGMSG(L"\n");
    CoTaskMemFree(pGuidName);
    CoTaskMemFree(pGuidValName);
    PropVariantClear(&var);
    return hr;
}

HRESULT GetGUIDName(const GUID& guid, WCHAR **ppwsz)
{
    HRESULT hr = S_OK;
    WCHAR *pName = NULL;

    LPCWSTR pcwsz = GetGUIDNameConst(guid);
    if (pcwsz)
    {
        size_t cchLength = 0;

        hr = StringCchLength(pcwsz, STRSAFE_MAX_CCH, &cchLength);
        if (FAILED(hr))
        {
            goto done;
        }

        pName = (WCHAR*)CoTaskMemAlloc((cchLength + 1) * sizeof(WCHAR));
        if (pName == NULL)
        {
            hr = E_OUTOFMEMORY;
            goto done;
        }

        hr = StringCchCopy(pName, cchLength + 1, pcwsz);
        if (FAILED(hr))
        {
            goto done;
        }
    }
    else
    {
        hr = StringFromCLSID(guid, &pName);
    }

done:
    if (FAILED(hr))
    {
        *ppwsz = NULL;
        CoTaskMemFree(pName);
    }
    else
    {
        *ppwsz = pName;
    }
    return hr;
}

void LogUINT32AsUINT64(const PROPVARIANT& var)
{
    UINT32 uHigh = 0, uLow = 0;
    Unpack2UINT32AsUINT64(var.uhVal.QuadPart, &uHigh, &uLow);
    DBGMSG(L"%d x %d", uHigh, uLow);
}

float OffsetToFloat(const MFOffset& offset)
{
    return offset.value + (static_cast<float>(offset.fract) / 65536.0f);
}

HRESULT LogVideoArea(const PROPVARIANT& var)
{
    if (var.caub.cElems < sizeof(MFVideoArea))
    {
        return MF_E_BUFFERTOOSMALL;
    }

    MFVideoArea *pArea = (MFVideoArea*)var.caub.pElems;

    DBGMSG(L"(%f,%f) (%d,%d)",
        OffsetToFloat(pArea->OffsetX), OffsetToFloat(pArea->OffsetY),
        pArea->Area.cx, pArea->Area.cy);
    return S_OK;
}

// Handle certain known special cases.
HRESULT SpecialCaseAttributeValue(GUID guid, const PROPVARIANT& var)
{
    if ((guid == MF_MT_FRAME_RATE) || (guid == MF_MT_FRAME_RATE_RANGE_MAX) ||
        (guid == MF_MT_FRAME_RATE_RANGE_MIN) || (guid == MF_MT_FRAME_SIZE) ||
        (guid == MF_MT_PIXEL_ASPECT_RATIO))
    {
        // Attributes that contain two packed 32-bit values.
        LogUINT32AsUINT64(var);
    }
    else if ((guid == MF_MT_GEOMETRIC_APERTURE) ||
             (guid == MF_MT_MINIMUM_DISPLAY_APERTURE) ||
             (guid == MF_MT_PAN_SCAN_APERTURE))
    {
        // Attributes that contain an MFVideoArea structure.
        return LogVideoArea(var);
    }
    else
    {
        return S_FALSE;
    }
    return S_OK;
}

void DBGMSG(PCWSTR format, ...)
{
    va_list args;
    va_start(args, format);

    WCHAR msg[MAX_PATH];

    if (SUCCEEDED(StringCbVPrintf(msg, sizeof(msg), format, args)))
    {
        OutputDebugString(msg);
    }
    va_end(args);
}

#ifndef IF_EQUAL_RETURN
#define IF_EQUAL_RETURN(param, val) if(val == param) return L#val
#endif

LPCWSTR GetGUIDNameConst(const GUID& guid)
{
    IF_EQUAL_RETURN(guid, MF_MT_MAJOR_TYPE);
    IF_EQUAL_RETURN(guid, MF_MT_SUBTYPE);
    IF_EQUAL_RETURN(guid, MF_MT_ALL_SAMPLES_INDEPENDENT);
    IF_EQUAL_RETURN(guid, MF_MT_FIXED_SIZE_SAMPLES);
    IF_EQUAL_RETURN(guid, MF_MT_COMPRESSED);
    IF_EQUAL_RETURN(guid, MF_MT_SAMPLE_SIZE);
    IF_EQUAL_RETURN(guid, MF_MT_WRAPPED_TYPE);
    IF_EQUAL_RETURN(guid, MF_MT_AUDIO_NUM_CHANNELS);
    IF_EQUAL_RETURN(guid, MF_MT_AUDIO_SAMPLES_PER_SECOND);
    IF_EQUAL_RETURN(guid, MF_MT_AUDIO_FLOAT_SAMPLES_PER_SECOND);
    IF_EQUAL_RETURN(guid, MF_MT_AUDIO_AVG_BYTES_PER_SECOND);
    IF_EQUAL_RETURN(guid, MF_MT_AUDIO_BLOCK_ALIGNMENT);
    IF_EQUAL_RETURN(guid, MF_MT_AUDIO_BITS_PER_SAMPLE);
    IF_EQUAL_RETURN(guid, MF_MT_AUDIO_VALID_BITS_PER_SAMPLE);
    IF_EQUAL_RETURN(guid, MF_MT_AUDIO_SAMPLES_PER_BLOCK);
    IF_EQUAL_RETURN(guid, MF_MT_AUDIO_CHANNEL_MASK);
    IF_EQUAL_RETURN(guid, MF_MT_AUDIO_FOLDDOWN_MATRIX);
    IF_EQUAL_RETURN(guid, MF_MT_AUDIO_WMADRC_PEAKREF);
    IF_EQUAL_RETURN(guid, MF_MT_AUDIO_WMADRC_PEAKTARGET);
    IF_EQUAL_RETURN(guid, MF_MT_AUDIO_WMADRC_AVGREF);
    IF_EQUAL_RETURN(guid, MF_MT_AUDIO_WMADRC_AVGTARGET);
    IF_EQUAL_RETURN(guid, MF_MT_AUDIO_PREFER_WAVEFORMATEX);
    IF_EQUAL_RETURN(guid, MF_MT_AAC_PAYLOAD_TYPE);
    IF_EQUAL_RETURN(guid, MF_MT_AAC_AUDIO_PROFILE_LEVEL_INDICATION);
    IF_EQUAL_RETURN(guid, MF_MT_FRAME_SIZE);
    IF_EQUAL_RETURN(guid, MF_MT_FRAME_RATE);
    IF_EQUAL_RETURN(guid, MF_MT_FRAME_RATE_RANGE_MAX);
    IF_EQUAL_RETURN(guid, MF_MT_FRAME_RATE_RANGE_MIN);
    IF_EQUAL_RETURN(guid, MF_MT_PIXEL_ASPECT_RATIO);
    IF_EQUAL_RETURN(guid, MF_MT_DRM_FLAGS);
    IF_EQUAL_RETURN(guid, MF_MT_PAD_CONTROL_FLAGS);
    IF_EQUAL_RETURN(guid, MF_MT_SOURCE_CONTENT_HINT);
    IF_EQUAL_RETURN(guid, MF_MT_VIDEO_CHROMA_SITING);
    IF_EQUAL_RETURN(guid, MF_MT_INTERLACE_MODE);
    IF_EQUAL_RETURN(guid, MF_MT_TRANSFER_FUNCTION);
    IF_EQUAL_RETURN(guid, MF_MT_VIDEO_PRIMARIES);
    IF_EQUAL_RETURN(guid, MF_MT_CUSTOM_VIDEO_PRIMARIES);
    IF_EQUAL_RETURN(guid, MF_MT_YUV_MATRIX);
    IF_EQUAL_RETURN(guid, MF_MT_VIDEO_LIGHTING);
    IF_EQUAL_RETURN(guid, MF_MT_VIDEO_NOMINAL_RANGE);
    IF_EQUAL_RETURN(guid, MF_MT_GEOMETRIC_APERTURE);
    IF_EQUAL_RETURN(guid, MF_MT_MINIMUM_DISPLAY_APERTURE);
    IF_EQUAL_RETURN(guid, MF_MT_PAN_SCAN_APERTURE);
    IF_EQUAL_RETURN(guid, MF_MT_PAN_SCAN_ENABLED);
    IF_EQUAL_RETURN(guid, MF_MT_AVG_BITRATE);
    IF_EQUAL_RETURN(guid, MF_MT_AVG_BIT_ERROR_RATE);
    IF_EQUAL_RETURN(guid, MF_MT_MAX_KEYFRAME_SPACING);
    IF_EQUAL_RETURN(guid, MF_MT_DEFAULT_STRIDE);
    IF_EQUAL_RETURN(guid, MF_MT_PALETTE);
    IF_EQUAL_RETURN(guid, MF_MT_USER_DATA);
    IF_EQUAL_RETURN(guid, MF_MT_AM_FORMAT_TYPE);
    IF_EQUAL_RETURN(guid, MF_MT_MPEG_START_TIME_CODE);
    IF_EQUAL_RETURN(guid, MF_MT_MPEG2_PROFILE);
    IF_EQUAL_RETURN(guid, MF_MT_MPEG2_LEVEL);
    IF_EQUAL_RETURN(guid, MF_MT_MPEG2_FLAGS);
    IF_EQUAL_RETURN(guid, MF_MT_MPEG_SEQUENCE_HEADER);
    IF_EQUAL_RETURN(guid, MF_MT_DV_AAUX_SRC_PACK_0);
    IF_EQUAL_RETURN(guid, MF_MT_DV_AAUX_CTRL_PACK_0);
    IF_EQUAL_RETURN(guid, MF_MT_DV_AAUX_SRC_PACK_1);
    IF_EQUAL_RETURN(guid, MF_MT_DV_AAUX_CTRL_PACK_1);
    IF_EQUAL_RETURN(guid, MF_MT_DV_VAUX_SRC_PACK);
    IF_EQUAL_RETURN(guid, MF_MT_DV_VAUX_CTRL_PACK);
    IF_EQUAL_RETURN(guid, MF_MT_ARBITRARY_HEADER);
    IF_EQUAL_RETURN(guid, MF_MT_ARBITRARY_FORMAT);
    IF_EQUAL_RETURN(guid, MF_MT_IMAGE_LOSS_TOLERANT);
    IF_EQUAL_RETURN(guid, MF_MT_MPEG4_SAMPLE_DESCRIPTION);
    IF_EQUAL_RETURN(guid, MF_MT_MPEG4_CURRENT_SAMPLE_ENTRY);
    IF_EQUAL_RETURN(guid, MF_MT_ORIGINAL_4CC);
    IF_EQUAL_RETURN(guid, MF_MT_ORIGINAL_WAVE_FORMAT_TAG);

    // Media types
    IF_EQUAL_RETURN(guid, MFMediaType_Audio);
    IF_EQUAL_RETURN(guid, MFMediaType_Video);
    IF_EQUAL_RETURN(guid, MFMediaType_Protected);
    IF_EQUAL_RETURN(guid, MFMediaType_SAMI);
    IF_EQUAL_RETURN(guid, MFMediaType_Script);
    IF_EQUAL_RETURN(guid, MFMediaType_Image);
    IF_EQUAL_RETURN(guid, MFMediaType_HTML);
    IF_EQUAL_RETURN(guid, MFMediaType_Binary);
    IF_EQUAL_RETURN(guid, MFMediaType_FileTransfer);

    IF_EQUAL_RETURN(guid, MFVideoFormat_AI44);   // FCC('AI44')
    IF_EQUAL_RETURN(guid, MFVideoFormat_ARGB32); // D3DFMT_A8R8G8B8
    IF_EQUAL_RETURN(guid, MFVideoFormat_AYUV);   // FCC('AYUV')
    IF_EQUAL_RETURN(guid, MFVideoFormat_DV25);   // FCC('dv25')
    IF_EQUAL_RETURN(guid, MFVideoFormat_DV50);   // FCC('dv50')
    IF_EQUAL_RETURN(guid, MFVideoFormat_DVH1);   // FCC('dvh1')
    IF_EQUAL_RETURN(guid, MFVideoFormat_DVSD);   // FCC('dvsd')
    IF_EQUAL_RETURN(guid, MFVideoFormat_DVSL);   // FCC('dvsl')
    IF_EQUAL_RETURN(guid, MFVideoFormat_H264);   // FCC('H264')
    IF_EQUAL_RETURN(guid, MFVideoFormat_I420);   // FCC('I420')
    IF_EQUAL_RETURN(guid, MFVideoFormat_IYUV);   // FCC('IYUV')
    IF_EQUAL_RETURN(guid, MFVideoFormat_M4S2);   // FCC('M4S2')
    IF_EQUAL_RETURN(guid, MFVideoFormat_MJPG);
    IF_EQUAL_RETURN(guid, MFVideoFormat_MP43);   // FCC('MP43')
    IF_EQUAL_RETURN(guid, MFVideoFormat_MP4S);   // FCC('MP4S')
    IF_EQUAL_RETURN(guid, MFVideoFormat_MP4V);   // FCC('MP4V')
    IF_EQUAL_RETURN(guid, MFVideoFormat_MPG1);   // FCC('MPG1')
    IF_EQUAL_RETURN(guid, MFVideoFormat_MSS1);   // FCC('MSS1')
    IF_EQUAL_RETURN(guid, MFVideoFormat_MSS2);   // FCC('MSS2')
    IF_EQUAL_RETURN(guid, MFVideoFormat_NV11);   // FCC('NV11')
    IF_EQUAL_RETURN(guid, MFVideoFormat_NV12);   // FCC('NV12')
    IF_EQUAL_RETURN(guid, MFVideoFormat_P010);   // FCC('P010')
    IF_EQUAL_RETURN(guid, MFVideoFormat_P016);   // FCC('P016')
    IF_EQUAL_RETURN(guid, MFVideoFormat_P210);   // FCC('P210')
    IF_EQUAL_RETURN(guid, MFVideoFormat_P216);   // FCC('P216')
    IF_EQUAL_RETURN(guid, MFVideoFormat_RGB24);  // D3DFMT_R8G8B8
    IF_EQUAL_RETURN(guid, MFVideoFormat_RGB32);  // D3DFMT_X8R8G8B8
    IF_EQUAL_RETURN(guid, MFVideoFormat_RGB555); // D3DFMT_X1R5G5B5
    IF_EQUAL_RETURN(guid, MFVideoFormat_RGB565); // D3DFMT_R5G6B5
    IF_EQUAL_RETURN(guid, MFVideoFormat_RGB8);
    IF_EQUAL_RETURN(guid, MFVideoFormat_UYVY);   // FCC('UYVY')
    IF_EQUAL_RETURN(guid, MFVideoFormat_v210);   // FCC('v210')
    IF_EQUAL_RETURN(guid, MFVideoFormat_v410);   // FCC('v410')
    IF_EQUAL_RETURN(guid, MFVideoFormat_WMV1);   // FCC('WMV1')
    IF_EQUAL_RETURN(guid, MFVideoFormat_WMV2);   // FCC('WMV2')
    IF_EQUAL_RETURN(guid, MFVideoFormat_WMV3);   // FCC('WMV3')
    IF_EQUAL_RETURN(guid, MFVideoFormat_WVC1);   // FCC('WVC1')
    IF_EQUAL_RETURN(guid, MFVideoFormat_Y210);   // FCC('Y210')
    IF_EQUAL_RETURN(guid, MFVideoFormat_Y216);   // FCC('Y216')
    IF_EQUAL_RETURN(guid, MFVideoFormat_Y410);   // FCC('Y410')
    IF_EQUAL_RETURN(guid, MFVideoFormat_Y416);   // FCC('Y416')
    IF_EQUAL_RETURN(guid, MFVideoFormat_Y41P);
    IF_EQUAL_RETURN(guid, MFVideoFormat_Y41T);
    IF_EQUAL_RETURN(guid, MFVideoFormat_YUY2);   // FCC('YUY2')
    IF_EQUAL_RETURN(guid, MFVideoFormat_YV12);   // FCC('YV12')
    IF_EQUAL_RETURN(guid, MFVideoFormat_YVYU);

    IF_EQUAL_RETURN(guid, MFAudioFormat_PCM);              // WAVE_FORMAT_PCM
    IF_EQUAL_RETURN(guid, MFAudioFormat_Float);            // WAVE_FORMAT_IEEE_FLOAT
    IF_EQUAL_RETURN(guid, MFAudioFormat_DTS);              // WAVE_FORMAT_DTS
    IF_EQUAL_RETURN(guid, MFAudioFormat_Dolby_AC3_SPDIF);  // WAVE_FORMAT_DOLBY_AC3_SPDIF
    IF_EQUAL_RETURN(guid, MFAudioFormat_DRM);              // WAVE_FORMAT_DRM
    IF_EQUAL_RETURN(guid, MFAudioFormat_WMAudioV8);        // WAVE_FORMAT_WMAUDIO2
    IF_EQUAL_RETURN(guid, MFAudioFormat_WMAudioV9);        // WAVE_FORMAT_WMAUDIO3
    IF_EQUAL_RETURN(guid, MFAudioFormat_WMAudio_Lossless); // WAVE_FORMAT_WMAUDIO_LOSSLESS
    IF_EQUAL_RETURN(guid, MFAudioFormat_WMASPDIF);         // WAVE_FORMAT_WMASPDIF
    IF_EQUAL_RETURN(guid, MFAudioFormat_MSP1);             // WAVE_FORMAT_WMAVOICE9
    IF_EQUAL_RETURN(guid, MFAudioFormat_MP3);              // WAVE_FORMAT_MPEGLAYER3
    IF_EQUAL_RETURN(guid, MFAudioFormat_MPEG);             // WAVE_FORMAT_MPEG
    IF_EQUAL_RETURN(guid, MFAudioFormat_AAC);              // WAVE_FORMAT_MPEG_HEAAC
    IF_EQUAL_RETURN(guid, MFAudioFormat_ADTS);             // WAVE_FORMAT_MPEG_ADTS_AAC

    return NULL;
}