This article builds on the 百問網 (100ask) embedded Linux projects "digital photo frame and file browser" and "embedded Linux camera".
Goal: add a camera page that supports live preview and snapshot capture.
Framework Overview
Introduction to V4L2
V4L2 is a general-purpose API on the Linux operating system for capturing image, video and audio data. Combined with a suitable capture device and the corresponding driver, it can be used to acquire images, video, audio and so on.
A camera driven by V4L2 normally appears as the device file /dev/videoX (where X is a number corresponding to a specific device).
V4L2 video capture steps:
(1) Open the device and initialize its parameters; through the V4L2 interface, set the capture window, the frame size and the pixel format;
(2) Request frame buffers and memory-map them from kernel space into user space, so the application can read and process the image data;
(3) Queue the frame buffers and start video capture;
(4) The driver starts capturing video data; the application dequeues a frame buffer from the capture output queue, processes it, then queues it back onto the capture input queue, repeating this cycle to obtain a continuous video stream;
(5) Release the resources and stop capturing.
This project needs continuous video capture, so the faster memory-mapping (mmap) method is used for V4L2 image acquisition.
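To make steps (1)-(5) concrete, here is a minimal, self-contained sketch of the standard V4L2 mmap capture sequence. It is illustrative only: the device path /dev/video0, the 640x480 YUYV format and the buffer count of 4 are assumptions, not values taken from this project.

#include <fcntl.h>
#include <stdio.h>
#include <string.h>
#include <sys/ioctl.h>
#include <sys/mman.h>
#include <unistd.h>
#include <linux/videodev2.h>

#define BUF_CNT 4

int main(void)
{
    /* step (1): open the device and set the capture format */
    int fd = open("/dev/video0", O_RDWR);
    if (fd < 0) { perror("open"); return -1; }

    struct v4l2_format fmt;
    memset(&fmt, 0, sizeof(fmt));
    fmt.type                = V4L2_BUF_TYPE_VIDEO_CAPTURE;
    fmt.fmt.pix.width       = 640;                  /* assumed capture window */
    fmt.fmt.pix.height      = 480;
    fmt.fmt.pix.pixelformat = V4L2_PIX_FMT_YUYV;    /* assumed pixel format */
    fmt.fmt.pix.field       = V4L2_FIELD_ANY;
    if (ioctl(fd, VIDIOC_S_FMT, &fmt) < 0) { perror("VIDIOC_S_FMT"); return -1; }

    /* step (2): request frame buffers and mmap them into user space */
    struct v4l2_requestbuffers req;
    memset(&req, 0, sizeof(req));
    req.count  = BUF_CNT;
    req.type   = V4L2_BUF_TYPE_VIDEO_CAPTURE;
    req.memory = V4L2_MEMORY_MMAP;
    if (ioctl(fd, VIDIOC_REQBUFS, &req) < 0) { perror("VIDIOC_REQBUFS"); return -1; }

    void   *bufs[BUF_CNT];
    size_t  lens[BUF_CNT];
    for (unsigned i = 0; i < req.count && i < BUF_CNT; i++) {
        struct v4l2_buffer buf;
        memset(&buf, 0, sizeof(buf));
        buf.type   = V4L2_BUF_TYPE_VIDEO_CAPTURE;
        buf.memory = V4L2_MEMORY_MMAP;
        buf.index  = i;
        if (ioctl(fd, VIDIOC_QUERYBUF, &buf) < 0) { perror("VIDIOC_QUERYBUF"); return -1; }
        lens[i] = buf.length;
        bufs[i] = mmap(NULL, buf.length, PROT_READ | PROT_WRITE, MAP_SHARED, fd, buf.m.offset);
        if (bufs[i] == MAP_FAILED) { perror("mmap"); return -1; }
        /* step (3): queue the buffer so the driver can fill it */
        if (ioctl(fd, VIDIOC_QBUF, &buf) < 0) { perror("VIDIOC_QBUF"); return -1; }
    }

    int type = V4L2_BUF_TYPE_VIDEO_CAPTURE;
    if (ioctl(fd, VIDIOC_STREAMON, &type) < 0) { perror("VIDIOC_STREAMON"); return -1; }

    /* step (4) would loop on VIDIOC_DQBUF / process / VIDIOC_QBUF here */

    /* step (5): stop streaming and release the resources */
    ioctl(fd, VIDIOC_STREAMOFF, &type);
    for (unsigned i = 0; i < req.count && i < BUF_CNT; i++)
        munmap(bufs[i], lens[i]);
    close(fd);
    return 0;
}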
Main Modules
Display module
- Format conversion: convert YUV/MJPEG/RGB data to the LCD's RGB format (a small conversion sketch follows this list)
- LCD display: unified frame buffer control, with VideoMem page-buffer management for video
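As a concrete example of the format-conversion step mentioned above, the sketch below converts one pair of YUYV pixels to RGB565 using the common BT.601 integer approximation. It is only illustrative; the project's convert modules do the same work for whole frames and support additional formats.

#include <stdint.h>

static inline uint8_t clamp_u8(int v)
{
    return (v < 0) ? 0 : (v > 255) ? 255 : (uint8_t)v;
}

static inline uint16_t pack_rgb565(int r, int g, int b)
{
    return (uint16_t)(((clamp_u8(r) >> 3) << 11) |
                      ((clamp_u8(g) >> 2) << 5)  |
                       (clamp_u8(b) >> 3));
}

/* yuyv points to 4 bytes: Y0 U Y1 V -> produces 2 RGB565 pixels */
static void yuyv_pair_to_rgb565(const uint8_t *yuyv, uint16_t out[2])
{
    int y0 = yuyv[0], u = yuyv[1] - 128, y1 = yuyv[2], v = yuyv[3] - 128;

    out[0] = pack_rgb565(y0 + ((359 * v) >> 8),            /* R = Y + 1.402 V           */
                         y0 - ((88 * u + 183 * v) >> 8),    /* G = Y - 0.344 U - 0.714 V */
                         y0 + ((454 * u) >> 8));            /* B = Y + 1.772 U           */
    out[1] = pack_rgb565(y1 + ((359 * v) >> 8),
                         y1 - ((88 * u + 183 * v) >> 8),
                         y1 + ((454 * u) >> 8));
}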
Video module
- V4L2 device management: starting and stopping the device and capturing frame data, all done through the corresponding ioctl commands.
struct VideoDevice {
    int iFd;                              /* file descriptor of /dev/videoX */
    int iPixelFormat;                     /* pixel format reported by the driver */
    int iWidth;                           /* frame width */
    int iHeight;                          /* frame height */
    int iVideoBufCnt;                     /* number of mmap'ed frame buffers */
    int iVideoBufMaxLen;                  /* length of each frame buffer */
    int iVideoBufCurIndex;                /* index of the buffer currently dequeued */
    unsigned char *pucVideBuf[NB_BUFFER]; /* user-space addresses of the mmap'ed buffers */

    /* operations */
    PT_VideoOpr ptOPr;
};
struct VideoOpr {
    char *name;                           /* implementation name, e.g. "v4l2" */
    int (*InitDevice)(char *strDevName, PT_VideoDevice ptVideoDevice);
    int (*ExitDevice)(PT_VideoDevice ptVideoDevice);
    int (*GetFrame)(PT_VideoDevice ptVideoDevice, PT_PixelDatas ptPixelDatas);  /* dequeue one frame */
    int (*GetFormat)(PT_VideoDevice ptVideoDevice);
    int (*PutFrame)(PT_VideoDevice ptVideoDevice, PT_PixelDatas ptPixelDatas);  /* queue the frame back */
    int (*StartDevice)(PT_VideoDevice ptVideoDevice);                           /* VIDIOC_STREAMON */
    int (*StopDevice)(PT_VideoDevice ptVideoDevice);                            /* VIDIOC_STREAMOFF */
    struct VideoOpr *ptNext;              /* next implementation in the linked list */
};
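The ptNext member suggests that the concrete implementations (for example a V4L2 backend) are kept in a singly linked list and selected by name. The following is a hedged sketch of what such a register/lookup pair could look like; RegisterVideoOpr, GetVideoOpr and g_ptVideoOprHead are hypothetical names, and PT_VideoOpr is assumed to be the project's usual typedef for struct VideoOpr *.

#include <string.h>

static PT_VideoOpr g_ptVideoOprHead = NULL;

int RegisterVideoOpr(PT_VideoOpr ptVideoOpr)
{
    PT_VideoOpr ptTmp;

    ptVideoOpr->ptNext = NULL;
    if (!g_ptVideoOprHead) {
        g_ptVideoOprHead = ptVideoOpr;    /* first implementation becomes the head */
    } else {
        ptTmp = g_ptVideoOprHead;         /* otherwise append to the tail of the list */
        while (ptTmp->ptNext)
            ptTmp = ptTmp->ptNext;
        ptTmp->ptNext = ptVideoOpr;
    }
    return 0;
}

PT_VideoOpr GetVideoOpr(char *pcName)
{
    PT_VideoOpr ptTmp = g_ptVideoOprHead;

    while (ptTmp) {
        if (strcmp(ptTmp->name, pcName) == 0)  /* match by implementation name, e.g. "v4l2" */
            return ptTmp;
        ptTmp = ptTmp->ptNext;
    }
    return NULL;
}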
Core Flow
Implementation Notes
Add a video playback page: video_page.
Opening and closing the camera device
Note that after the device has been closed, the buffers must be reset before it is reopened, for the following reasons:
- Lost state: when a V4L2 device is closed, its device-related state is cleared, including the buffer state (allocated memory, buffer indices, fill status and so on). After reopening the device, this state has to be reinitialized so the buffers can correctly hold new video data.
- Avoiding stale data and conflicts: if the buffers are not reset, old or incomplete data left over from before the device was closed may interfere with the new capture process. The internal pointers and counters associated with the buffers may also be inconsistent, and reusing them can crash the program or produce corrupt video data.
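The start/stop code below relies on two helpers, RequstBuffer() and FreeBuffer(), that are not shown in this article. Under V4L2's mmap streaming I/O they might look roughly like the following sketch (an assumption based on the struct fields above, not the project's actual implementation; requires <string.h>, <sys/ioctl.h>, <sys/mman.h> and <linux/videodev2.h>).

static int RequstBuffer(PT_VideoDevice ptVideoDevice)
{
    struct v4l2_requestbuffers tReqBufs;
    struct v4l2_buffer tBuf;
    int i;

    memset(&tReqBufs, 0, sizeof(tReqBufs));
    tReqBufs.count  = ptVideoDevice->iVideoBufCnt;
    tReqBufs.type   = V4L2_BUF_TYPE_VIDEO_CAPTURE;
    tReqBufs.memory = V4L2_MEMORY_MMAP;
    if (ioctl(ptVideoDevice->iFd, VIDIOC_REQBUFS, &tReqBufs))
        return -1;
    ptVideoDevice->iVideoBufCnt = tReqBufs.count;   /* the driver may adjust the count */

    for (i = 0; i < ptVideoDevice->iVideoBufCnt; i++) {
        memset(&tBuf, 0, sizeof(tBuf));
        tBuf.type   = V4L2_BUF_TYPE_VIDEO_CAPTURE;
        tBuf.memory = V4L2_MEMORY_MMAP;
        tBuf.index  = i;
        if (ioctl(ptVideoDevice->iFd, VIDIOC_QUERYBUF, &tBuf))
            return -1;
        ptVideoDevice->pucVideBuf[i] = mmap(NULL, tBuf.length,
                                            PROT_READ | PROT_WRITE, MAP_SHARED,
                                            ptVideoDevice->iFd, tBuf.m.offset);
        if (ptVideoDevice->pucVideBuf[i] == MAP_FAILED)
            return -1;
        ptVideoDevice->iVideoBufMaxLen = tBuf.length;
        if (ioctl(ptVideoDevice->iFd, VIDIOC_QBUF, &tBuf))  /* queue for capture */
            return -1;
    }
    return 0;
}

static int FreeBuffer(PT_VideoDevice ptVideoDevice)
{
    struct v4l2_requestbuffers tReqBufs;
    int i;

    for (i = 0; i < ptVideoDevice->iVideoBufCnt; i++) {
        if (ptVideoDevice->pucVideBuf[i]) {
            munmap(ptVideoDevice->pucVideBuf[i], ptVideoDevice->iVideoBufMaxLen);
            ptVideoDevice->pucVideBuf[i] = NULL;    /* so a later reopen starts clean */
        }
    }

    memset(&tReqBufs, 0, sizeof(tReqBufs));
    tReqBufs.count  = 0;                            /* REQBUFS(0) releases the driver buffers */
    tReqBufs.type   = V4L2_BUF_TYPE_VIDEO_CAPTURE;
    tReqBufs.memory = V4L2_MEMORY_MMAP;
    return ioctl(ptVideoDevice->iFd, VIDIOC_REQBUFS, &tReqBufs) ? -1 : 0;
}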
static int V4l2StartDevice(PT_VideoDevice ptVideoDevice)
{
    int iType = V4L2_BUF_TYPE_VIDEO_CAPTURE;
    int iError;

    iError = RequstBuffer(ptVideoDevice);
    if (iError)
    {
        DBG_PRINTF("Unable to RequstBuffer.\n");
        return -1;
    }

    iError = ioctl(ptVideoDevice->iFd, VIDIOC_STREAMON, &iType);
    if (iError)
    {
        DBG_PRINTF("Unable to start capture.\n");
        return -1;
    }

    return 0;
}
static int V4l2StopDevice(PT_VideoDevice ptVideoDevice)
{
    int iType = V4L2_BUF_TYPE_VIDEO_CAPTURE;
    int iError;

    /* stop streaming first, then release the buffers */
    iError = ioctl(ptVideoDevice->iFd, VIDIOC_STREAMOFF, &iType);
    if (iError)
    {
        DBG_PRINTF("Unable to stop capture.\n");
        return -1;
    }

    iError = FreeBuffer(ptVideoDevice);
    if (iError)
    {
        DBG_PRINTF("Unable to FreeBuffer.\n");
        return -1;
    }

    return 0;
}
Start a thread that continuously processes and displays the data:
- get a frame from the queue
- convert it to RGB
- scale it and merge it into the designated display area of the current page buffer
- flush it to the display device
static void *VideoPlayThreadFunction(void *pVoid)
{
    int iPixelFormatOfVideo;
    int iPixelFormatOfDisp;
    PT_VideoDevice ptVideoDevice = GetVideoDevice();
    PT_VideoConvert ptVideoConvert;
    PT_PixelDatas ptVideoBufCur;
    T_PixelDatas tVideoBuf;     /* raw frame from the camera */
    T_PixelDatas tConvertBuf;   /* frame converted to the LCD pixel format */
    T_PixelDatas tZoomBuf;      /* frame scaled to fit the display area */

    /* initialization */
    PT_VideoMem ptVideoMemCur = GetDevVideoMem(); //GetCurVideoMem();
    int iLcdWidth, iLcdHeight, iLcdBpp;
    int iWidth  = g_tVideoPicLayout.iBotRightX - g_tVideoPicLayout.iTopLeftX + 1;
    int iHeight = g_tVideoPicLayout.iBotRightY - g_tVideoPicLayout.iTopLeftY + 1;
    int iError;
    int iCapture;
    time_t timep;
    struct tm *tmp;
    char strTimeBuf[64];
    float k;

    GetDispResolution(&iLcdWidth, &iLcdHeight, &iLcdBpp);
    iPixelFormatOfDisp = (iLcdBpp == 16) ? V4L2_PIX_FMT_RGB565 :
                         (iLcdBpp == 32) ? V4L2_PIX_FMT_RGB32 : 0;

    iPixelFormatOfVideo = GetVideoDeviceFormat(NULL);
    ptVideoConvert = GetVideoConvertForFormats(iPixelFormatOfVideo, iPixelFormatOfDisp);
    if (NULL == ptVideoConvert)
    {
        DBG_PRINTF("can not support this format convert\n");
        return NULL;
    }

    memset(&tVideoBuf, 0, sizeof(tVideoBuf));
    memset(&tConvertBuf, 0, sizeof(tConvertBuf));
    tConvertBuf.iBpp = iLcdBpp;
    memset(&tZoomBuf, 0, sizeof(tZoomBuf));

    while (1)
    {
        /* check whether the UI has asked the thread to exit */
        pthread_mutex_lock(&g_tVideoPlayThreadMutex);
        iError = g_bThreadExit;
        pthread_mutex_unlock(&g_tVideoPlayThreadMutex);
        if (iError)
            return NULL;

        /* dequeue a camera frame */
        iError = ptVideoDevice->ptOPr->GetFrame(ptVideoDevice, &tVideoBuf);
        if (iError)
        {
            DBG_PRINTF("Video GetFrame error!\n");
            break;
        }
        ptVideoBufCur = &tVideoBuf;

        /* convert to the LCD RGB format if necessary */
        if (iPixelFormatOfVideo != iPixelFormatOfDisp)
        {
            iError = ptVideoConvert->Convert(iPixelFormatOfVideo, iPixelFormatOfDisp, &tVideoBuf, &tConvertBuf);
            if (iError)
            {
                DBG_PRINTF("Convert %s error!\n", ptVideoConvert->name);
                break;
            }
            ptVideoBufCur = &tConvertBuf;
        }

        /* if the frame is larger than the display area, scale it down */
        if (ptVideoBufCur->iWidth > iWidth || ptVideoBufCur->iHeight > iHeight)
        {
            /* scale while keeping the aspect ratio */
            k = (float)ptVideoBufCur->iHeight / ptVideoBufCur->iWidth;
            tZoomBuf.iWidth  = iWidth;
            tZoomBuf.iHeight = iWidth * k;
            if (tZoomBuf.iHeight > iHeight)
            {
                tZoomBuf.iWidth  = iHeight / k;
                tZoomBuf.iHeight = iHeight;
            }
            tZoomBuf.iBpp        = iLcdBpp;
            tZoomBuf.iLineBytes  = tZoomBuf.iWidth * tZoomBuf.iBpp / 8;
            tZoomBuf.iTotalBytes = tZoomBuf.iLineBytes * tZoomBuf.iHeight;
            if (!tZoomBuf.aucPixelDatas)
            {
                tZoomBuf.aucPixelDatas = malloc(tZoomBuf.iTotalBytes);
            }
            PicZoom(ptVideoBufCur, &tZoomBuf);
            ptVideoBufCur = &tZoomBuf;
        }

        /* merge into the display area and flush to the device */
        PicMerge(g_tVideoPicLayout.iTopLeftX, g_tVideoPicLayout.iTopLeftY, ptVideoBufCur, &(ptVideoMemCur->tPixelDatas));
        FlushVideoMemToDev(ptVideoMemCur);

        /* snapshot: save the current frame when the capture flag is set */
        pthread_mutex_lock(&g_tVideoCaptureThreadMutex);
        iCapture = g_bThreadCpature;
        g_bThreadCpature = 0;
        pthread_mutex_unlock(&g_tVideoCaptureThreadMutex);
        if (iCapture)
        {
            time(&timep);
            tmp = gmtime(&timep);
            strftime(strTimeBuf, sizeof(strTimeBuf), "picture-%Y%m%d-%H%M%S.bmp", tmp);
            printf("photo name: %s\n", strTimeBuf);
            Parser("bmp")->SaveFile(ptVideoBufCur->aucPixelDatas, ptVideoBufCur->iWidth, ptVideoBufCur->iHeight, ptVideoBufCur->iBpp, strTimeBuf);
        }

        PutVideoMem(ptVideoMemCur);

        /* queue the frame buffer back onto the free queue */
        iError = ptVideoDevice->ptOPr->PutFrame(ptVideoDevice, &tVideoBuf);
        if (iError)
        {
            DBG_PRINTF("Video PutFrame error!\n");
            break;
        }
    }

    if (tConvertBuf.aucPixelDatas)
        free(tConvertBuf.aucPixelDatas);
    if (tZoomBuf.aucPixelDatas)
        free(tZoomBuf.aucPixelDatas);
    return NULL;
}
The thread loops reading frame data, processing it and displaying it on the device, while the "exit" and "snapshot" operations are controlled through mutex-protected flags.
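For completeness, the UI side of those two flags might look like the sketch below. The mutexes and global flag names match the thread code above; the two wrapper function names are illustrative assumptions.

/* called when the user leaves video_page */
static void RequestVideoPlayExit(void)
{
    pthread_mutex_lock(&g_tVideoPlayThreadMutex);
    g_bThreadExit = 1;               /* the thread sees this at the top of its loop */
    pthread_mutex_unlock(&g_tVideoPlayThreadMutex);
}

/* called when the snapshot button is pressed */
static void RequestVideoCapture(void)
{
    pthread_mutex_lock(&g_tVideoCaptureThreadMutex);
    g_bThreadCpature = 1;            /* the thread saves the next frame as a BMP */
    pthread_mutex_unlock(&g_tVideoCaptureThreadMutex);
}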
Snapshot feature
When the snapshot button is clicked, the currently captured frame is converted to RGB, the BMP headers are filled in according to the format, and the data is saved as a .bmp file named after the current time.
static int SaveRgbAsBMP(unsigned char *pRgb, unsigned int dwWidth, unsigned int dwHeight, unsigned int dwBpp, char *strFilename)
{
    BITMAPFILEHEADER tBmpFileHead;
    BITMAPINFOHEADER tBmpInfoHead;
    unsigned int dwSize;
    unsigned char *pPos = 0;
    FILE *fout;

    memset(&tBmpFileHead, 0, sizeof(BITMAPFILEHEADER));
    memset(&tBmpInfoHead, 0, sizeof(BITMAPINFOHEADER));

    fout = fopen(strFilename, "wb");
    if (!fout)
    {
        DBG_PRINTF("Can't create output file %s\n", strFilename);
        return -2;
    }

    /* BMP file header */
    tBmpFileHead.bfType    = 0x4d42;                                  /* "BM" */
    tBmpFileHead.bfSize    = 0x36 + dwWidth * dwHeight * (dwBpp / 8); /* headers + pixel data */
    tBmpFileHead.bfOffBits = 0x00000036;                              /* offset of the pixel data */

    /* BMP info header */
    tBmpInfoHead.biSize          = 0x00000028;
    tBmpInfoHead.biWidth         = dwWidth;
    tBmpInfoHead.biHeight        = dwHeight;
    tBmpInfoHead.biPlanes        = 0x0001;
    tBmpInfoHead.biBitCount      = dwBpp;
    tBmpInfoHead.biCompression   = 0;
    tBmpInfoHead.biSizeImage     = dwWidth * dwHeight * (dwBpp / 8);
    tBmpInfoHead.biXPelsPerMeter = 0;
    tBmpInfoHead.biYPelsPerMeter = 0;
    tBmpInfoHead.biClrUsed       = 0;
    tBmpInfoHead.biClrImportant  = 0;

    if (fwrite(&tBmpFileHead, 1, sizeof(tBmpFileHead), fout) != sizeof(tBmpFileHead))
    {
        DBG_PRINTF("Can't write BMP File Head to %s\n", strFilename);
        fclose(fout);
        return -3;
    }

    if (fwrite(&tBmpInfoHead, 1, sizeof(tBmpInfoHead), fout) != sizeof(tBmpInfoHead))
    {
        DBG_PRINTF("Can't write BMP File Info Head to %s\n", strFilename);
        fclose(fout);
        return -4;
    }

    /* BMP stores rows bottom-up, so write from the last row to the first */
    dwSize = dwWidth * dwBpp / 8;
    pPos = pRgb + (dwHeight - 1) * dwSize;
    while (pPos >= pRgb)
    {
        if (fwrite(pPos, 1, dwSize, fout) != dwSize)
        {
            DBG_PRINTF("Can't write data to BMP File %s\n", strFilename);
            fclose(fout);
            return -5;
        }
        pPos -= dwSize;
    }

    fclose(fout);
    return 0;
}