BlackBerry Native SDK: capture scrollview in invoke window

I am writing a simple application on the BB10 simulator to capture the contents of an invoke preview window (which contains a scrollable view for emails). I want to capture the entire scrollview of the invoked window/email, not just what is on screen. In the code below, I can get a handle to the window for the entire application and screen_read_window() its contents, but how do I iterate over the invoke window's controls, find the handle to the scrollview, and capture that?
InvokeRequest request;
// Set the target app
request.setTarget("sys.pim.uib.email.previewer");
// Set the action that the target app should execute
request.setAction("bb.action.VIEW");
// Set the MIME type of the data
request.setMimeType("message/rfc822");
// Specify the location of the data
request.setUri(QUrl("pim:message/rfc822:" + QString::number(accountId) + ":" + QString::number(messageId)));
//InvokeTargetReply *reply =
invokeManager->invoke(request);
sleep(2); // crude wait for the invoked card to render (this blocks the event loop)

screen_context_t screenshot_ctx = 0;
if (screen_create_context(&screenshot_ctx, SCREEN_APPLICATION_CONTEXT) != 0) {
    return;
}

// Set up a pixmap to receive the window contents
screen_pixmap_t screen_pix;
screen_buffer_t screenshot_buf;
char *screenshot_ptr = NULL;
int screenshot_stride = 0;
int usage, format;
int size[2];

screen_create_pixmap(&screen_pix, screenshot_ctx);
usage = SCREEN_USAGE_READ | SCREEN_USAGE_NATIVE;
screen_set_pixmap_property_iv(screen_pix, SCREEN_PROPERTY_USAGE, &usage);
format = SCREEN_FORMAT_RGBA8888;
screen_set_pixmap_property_iv(screen_pix, SCREEN_PROPERTY_FORMAT, &format);
size[0] = 768;
size[1] = 1280;
screen_set_pixmap_property_iv(screen_pix, SCREEN_PROPERTY_BUFFER_SIZE, size);
screen_create_pixmap_buffer(screen_pix);
screen_get_pixmap_property_pv(screen_pix, SCREEN_PROPERTY_RENDER_BUFFERS, (void **)&screenshot_buf);
screen_get_buffer_property_pv(screenshot_buf, SCREEN_PROPERTY_POINTER, (void **)&screenshot_ptr);
screen_get_buffer_property_iv(screenshot_buf, SCREEN_PROPERTY_STRIDE, &screenshot_stride);

// Read the application's main window into the pixmap buffer
screen_read_window(Application::instance()->mainWindow()->handle(), screenshot_buf, 0, NULL, 0);

// Wrap the raw RGBA rows in a BMP container so QImage can parse them
QByteArray array;
int nbytes = size[0] * size[1] * 4;
write_bitmap_header(nbytes, array, size);
for (int i = 0; i < size[1]; i++) {
    array.append(screenshot_ptr + i * screenshot_stride, size[0] * 4);
}
QImage image = QImage::fromData(array, "BMP");

// Save as JPEG in the shared photos folder
QFile outFile("shared/photos/temp1.jpeg");
outFile.open(QIODevice::WriteOnly);
image.save(&outFile, "JPEG");

//Close Email
invokeManager->closeChildCard();
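The write_bitmap_header() helper above isn't shown. For reference, here is a sketch of what it presumably does, reconstructed from the call site (my assumption, not the original code): it prepends a BITMAPFILEHEADER plus a BITMAPINFOHEADER describing a top-down 32-bpp image, so the rows appended top-to-bottom afterwards line up.
static void write_bitmap_header(int nbytes, QByteArray &array, const int size[2])
{
    // 14-byte BITMAPFILEHEADER followed by a 40-byte BITMAPINFOHEADER, little-endian
    QDataStream s(&array, QIODevice::WriteOnly);
    s.setByteOrder(QDataStream::LittleEndian);
    s << quint16(0x4D42)           // "BM"
      << quint32(54 + nbytes)      // total file size
      << quint16(0) << quint16(0)  // reserved
      << quint32(54);              // offset to pixel data
    s << quint32(40)               // BITMAPINFOHEADER size
      << qint32(size[0])           // width in pixels
      << qint32(-size[1])          // negative height = rows stored top-down
      << quint16(1)                // planes
      << quint16(32)               // bits per pixel
      << quint32(0)                // BI_RGB, no compression
      << quint32(nbytes)           // image size in bytes
      << qint32(0) << qint32(0)    // x/y pixels per meter
      << quint32(0) << quint32(0); // palette entries used/important
}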

Related

cv::imread image is not shown in MFC's picture box

I'm building a dialog-based MFC application.
When I click the button, I want to choose an image from the file explorer, load it with cv::imread(), and show it in the Picture Control.
I can load and show an image in the Picture Control with the following code.
void CMFCApplication3Dlg::OnBnClickedButton1()
{
    cv::Mat src = cv::imread("D:/source/repos/Testing_Photos/Large/cavalls.png");
    Display(src);
}
But not with the following code.
void CMFCApplication3Dlg::OnBnClickedButton1()
{
    TCHAR szFilter[] = _T("PNG (*.png)|*.png|JPEG (*.jpg)|*.jpg|Bitmap (*.bmp)|*.bmp||");
    CFileDialog dlg(TRUE, NULL, NULL, OFN_HIDEREADONLY | OFN_OVERWRITEPROMPT, szFilter, AfxGetMainWnd());
    if (dlg.DoModal() == IDOK)
    {
        CString cstrImgPath = dlg.GetPathName();
        CT2CA pszConvertedAnsiString(cstrImgPath);
        std::string strStd(pszConvertedAnsiString);
        cv::Mat src = cv::imread(strStd);
        Display(src);
    }
}
And the following is the "Display" function.
void CMFCApplication3Dlg::Display(cv::Mat& mat)
{
    CStatic* PictureB = (CStatic*)GetDlgItem(IDC_STATIC);
    CWnd* cwn = (CWnd*)GetDlgItem(IDC_STATIC);
    CDC* wcdc = cwn->GetDC();
    HDC whdc = wcdc->GetSafeHdc();
    RECT rec;
    cwn->GetClientRect(&rec);
    cv::Size matSize = cv::Size(rec.right, rec.bottom);
    BITMAPINFO bitmapinfo;
    bitmapinfo.bmiHeader.biBitCount = 24;
    bitmapinfo.bmiHeader.biWidth = mat.cols;
    bitmapinfo.bmiHeader.biHeight = -mat.rows; // negative height: top-down rows
    bitmapinfo.bmiHeader.biPlanes = 1;
    bitmapinfo.bmiHeader.biSize = sizeof(BITMAPINFOHEADER);
    bitmapinfo.bmiHeader.biCompression = BI_RGB;
    bitmapinfo.bmiHeader.biClrImportant = 0;
    bitmapinfo.bmiHeader.biClrUsed = 0;
    bitmapinfo.bmiHeader.biSizeImage = 0;
    bitmapinfo.bmiHeader.biXPelsPerMeter = 0;
    bitmapinfo.bmiHeader.biYPelsPerMeter = 0;
    StretchDIBits(
        whdc,
        0, 0, matSize.width, matSize.height, // destination rectangle
        0, 0, mat.cols, mat.rows,            // source rectangle
        mat.data,
        &bitmapinfo,
        DIB_RGB_COLORS,
        SRCCOPY);
}
I am very new to MFC and I really don't have any idea where I'm wrong.
Please help me!
Thank you.
This is not the correct method to paint: the picture will be erased every time there is a paint request.
That's what happens with CFileDialog, which forces another repaint immediately after you paint the image. You have to respond to OnPaint or OnDrawItem, or use CStatic::SetBitmap as noted in the comments.
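For illustration, a minimal sketch of the OnPaint route (m_src here is a hypothetical cv::Mat member that the button handler fills in; the names are mine, not from the question):
void CMFCApplication3Dlg::OnBnClickedButton1()
{
    // ... file dialog as before ...
    m_src = cv::imread(strStd); // keep the image in a member, not a local
    Invalidate();               // request a repaint
}

void CMFCApplication3Dlg::OnPaint()
{
    CDialogEx::OnPaint(); // let the dialog paint itself first
    if (!m_src.empty())
        Display(m_src);   // redraw the image on every paint request
}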
You can do this with the CImage class; there is no need for OpenCV:
CImage img;
if (S_OK == img.Load(L"unicode.jpg"))
{
    CStatic* control = (CStatic*)GetDlgItem(IDC_STATIC);
    control->ModifyStyle(0, SS_BITMAP);
    auto oldbmp = control->SetBitmap(img.Detach());
    if (oldbmp)
        DeleteObject(oldbmp);
}
Or you can use OpenCV to create an HBITMAP handle.
Note that OpenCV does not handle Unicode filenames. CW2A converts Unicode to ANSI character encoding. This fails if one or more code points cannot be represented in the currently active code page. To work around it, we can open the file with CFile or std::ifstream, read it as binary, then decode it with cv::imdecode instead. Example:
// Open the file from a Unicode path
const wchar_t* filename = L"unicode.jpg";
std::ifstream fin(filename, std::ios::binary);
if (!fin.good())
    return;
// Read it into memory
std::vector<char> vec(std::istreambuf_iterator<char>(fin), {});
cv::Mat src = cv::imdecode(vec, cv::IMREAD_COLOR);
if (!src.data)
    return;
// Create an HBITMAP
BITMAPINFOHEADER bi = { sizeof(bi), src.cols, -src.rows, 1, 24 };
CClientDC dc(this);
auto hbmp = CreateDIBitmap(dc, &bi, CBM_INIT, src.data, (BITMAPINFO*)&bi, 0);
// Send the HBITMAP to the control
auto control = (CStatic*)GetDlgItem(IDC_STATIC);
auto oldbmp = control->SetBitmap(hbmp);
if (oldbmp)
    DeleteObject(oldbmp);

How to encode and decode audio using Opus

I am trying to integrate Opus into my application. The encode and decode functions return positive values, which indicates success, but the output audio won't play. The raw audio data plays fine.
Here is how I encode the data. I use a 4-byte length prefix to separate packets.
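(The int_to_char()/char_to_int() helpers used below aren't shown; a little-endian pair consistent with how they're called would look like this, though the originals may differ.)
static void int_to_char(int v, unsigned char out[4]) {
    out[0] = v & 0xFF;
    out[1] = (v >> 8) & 0xFF;
    out[2] = (v >> 16) & 0xFF;
    out[3] = (v >> 24) & 0xFF;
}

static int char_to_int(const unsigned char in[4]) {
    return in[0] | (in[1] << 8) | (in[2] << 16) | ((int)in[3] << 24);
}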
self.encoder = opus_encoder_create(24000, 1, OPUS_APPLICATION_VOIP, &opusError);
opus_encoder_ctl(self.encoder, OPUS_SET_BANDWIDTH(OPUS_BANDWIDTH_SUPERWIDEBAND));

- (void)encodeBufferList:(AudioBufferList *)bufferList
{
    BOOL success = TPCircularBufferProduceBytes(_circularBuffer, bufferList->mBuffers[0].mData, bufferList->mBuffers[0].mDataByteSize);
    if (!success) {
        NSLog(@"insufficient space in circular buffer!");
    }
    if (!_encoding) {
        _encoding = YES;
        dispatch_async(self.processingQueue, ^{
            [self startEncodingLoop];
        });
    }
}

- (void)startEncodingLoop
{
    int32_t availableBytes = 0;
    opus_int16 *data = (opus_int16 *)TPCircularBufferTail(_circularBuffer, &availableBytes);
    int availableSamples = availableBytes / _inputASBD.mBytesPerFrame;
    /*!
     * Use dynamic duration
     */
//    int validSamples[6] = {2.5, 5, 10, 20, 40, 60}; // in milliseconds
//    int esample = validSamples[0] * self.sampleRate / 1000;
//    for (int i = 0; i < 6; i++) {
//        int32_t samp = validSamples[i] * self.sampleRate / 1000;
//        if (availableSamples < samp) {
//            break;
//        }
//        esample = samp;
//    }
    /*!
     * Use 20 ms
     */
    int esample = 20 * self.sampleRate / 1000;
    if (availableSamples < esample) {
        /*!
         * Out of data. Finish encoding
         */
        self.encoding = NO;
        [self.eDelegate didFinishEncode];
        return;
    }
//    printf("raw input value for packet \n");
//    for (int i = 0; i < esample * self.numberOfChannels; i++) {
//        printf("%d :", data[i]);
//    }
    int returnValue = opus_encode(_encoder, data, esample, _encoderOutputBuffer, 1000);
    TPCircularBufferConsume(_circularBuffer, esample * sizeof(opus_int16) * self.numberOfChannels);
//    printf("output encode \n");
//    for (int i = 0; i < returnValue; i++) {
//        printf("%d :", _encoderOutputBuffer[i]);
//    }
    NSMutableData *outputData = [NSMutableData new];
    NSError *error = nil;
    if (returnValue <= 0) {
        error = [OKUtilities errorForOpusErrorCode:returnValue];
    } else {
        [outputData appendBytes:_encoderOutputBuffer length:returnValue * sizeof(unsigned char)];
        unsigned char int_field[4];
        int_to_char(returnValue, int_field);
        NSData *header = [NSData dataWithBytes:&int_field[0] length:4 * sizeof(unsigned char)];
        if (self.eDelegate) {
            [self.eDelegate didEncodeWithData:header];
        }
    }
    if (self.eDelegate) {
        [self.eDelegate didEncodeWithData:outputData];
    }
    [self startEncodingLoop];
}
And here is the decode function:
self.decoder = opus_decoder_create(24000, 1, &opusError);
opus_decoder_ctl(self.decoder, OPUS_SET_SIGNAL(OPUS_SIGNAL_VOICE));
opus_decoder_ctl(self.decoder, OPUS_SET_GAIN(10));

- (void)startParseData:(unsigned char *)data remainingLen:(int)len
{
    if (len <= 0) {
        [self.dDelegate didFinishDecode];
        return;
    }
    int headLen = sizeof(unsigned char) * 4;
    unsigned char h[4];
    h[0] = data[0];
    h[1] = data[1];
    h[2] = data[2];
    h[3] = data[3];
    int packetLen = char_to_int(h);
    data += headLen;
    packetLen = packetLen * sizeof(unsigned char) * self.numberOfChannels;
    [self decodePacket:data length:packetLen remainingLen:len - headLen];
}

- (void)decodePacket:(unsigned char *)inputData length:(int)len remainingLen:(int)rl
{
    int bw = opus_packet_get_bandwidth(inputData); // TEST: returns OPUS_BANDWIDTH_SUPERWIDEBAND here
    int32_t decodedSamples = 0;
//    int validSamples[6] = {2.5, 5, 10, 20, 40, 60}; // in milliseconds
    /*!
     * Use 60 ms
     */
    int esample = 60 * self.sampleRate / 1000;
//    printf("input decode \n");
//    for (int i = 0; i < len; i++) {
//        printf("%d :", inputData[i]);
//    }
    _decoderBufferLength = esample * self.numberOfChannels * sizeof(opus_int16);
    int returnValue = opus_decode(_decoder, inputData, len, _outputBuffer, esample, 1);
    if (returnValue < 0) {
        NSError *error = [OKUtilities errorForOpusErrorCode:returnValue];
        NSLog(@"decode error %@", error);
        inputData += len;
        [self startParseData:inputData remainingLen:rl - len];
        return;
    }
    decodedSamples = returnValue;
    NSUInteger length = decodedSamples * self.numberOfChannels;
//    printf("raw decoded data \n");
//    for (int i = 0; i < length; i++) {
//        printf("%d :", _outputBuffer[i]);
//    }
    NSData *audioData = [NSData dataWithBytes:_outputBuffer length:length * sizeof(opus_int16)];
    if (self.dDelegate) {
        [self.dDelegate didDecodeData:audioData];
    }
    inputData += len;
    [self startParseData:inputData remainingLen:rl - len];
}
Please help me to point out what I am missing. An example would be great.
I think the problem is on the decode side:
You pass 1 as the fec argument to opus_decode(). This asks the decoder to generate the full packet duration's worth of data from error correction data in the current packet. I don't see any lost packet tracking in your code, so 0 should be passed instead. With that change your input and output duration should match.
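That is, with no loss tracking the call would simply be:
int returnValue = opus_decode(_decoder, inputData, len, _outputBuffer, esample, 0 /* fec off */);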
You configure the decoder for mono output, but later use self.numberOfChannels in length calculations. Those should match or you may get unexpected behaviour.
OPUS_SET_SIGNAL doesn't do anything in opus_decoder_ctl(); it just returns OPUS_UNIMPLEMENTED without affecting behaviour.
Opus packets can be up to 120 ms in duration, so your limit of 60 ms could fail to decode some streams. If you're only talking to your own app that won't cause a problem the way you've configured it, since libopus defaults to 20 ms frames.
I found what the problem is. I had set the audio format to float (kAudioFormatFlagIsPacked | kAudioFormatFlagIsFloat), so I should use opus_encode_float and opus_decode_float instead of opus_encode and opus_decode.
As @Ralph says, we should use fec = 0 in opus_decode. Thanks to @Ralph.
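For reference, a sketch of the float-PCM calls (same shape as above, but the PCM buffers hold floats in [-1.0, 1.0]; the variable names reuse the ones from the question):
int nbBytes   = opus_encode_float(_encoder, (const float *)data, esample, _encoderOutputBuffer, 1000);
int nbSamples = opus_decode_float(_decoder, inputData, len, (float *)_outputBuffer, esample, 0);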
One thing I notice is that you're treating the return value of opus_encode() as a number of samples encoded, when it's actually the number of bytes in the compressed packet. That means you're writing 50% or 75% garbage data from the end of _encoderOutputBuffer into your encoded stream.
Also make sure _encoderOutputBuffer has room for the hard-coded 1000-byte packet-length limit you're passing in.
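In other words, something along these lines (sizes illustrative):
unsigned char packet[1000]; // must be at least the max_data_bytes passed to opus_encode
opus_int32 nbBytes = opus_encode(_encoder, data, esample, packet, sizeof(packet));
// Only packet[0 .. nbBytes) is valid compressed output; nbBytes is bytes, not samples.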

Wrong processing of GetDIBits() in C++ Builder

I use GetDIBits() to retrieve the bits of a bitmap and copy them into a buffer. The function doesn't fail (the return value is nonzero), but I get the wrong bitmap height and an erroneous buffer.
This is a part of my code:
HDC hdcMemDC = CreateCompatibleDC(hDC); // hDC = BeginPaint(hwnd, &ps), obtained in WM_PAINT
int l_uiWidth = 400;
int l_uiHeight = 120;
HBITMAP hbmp = CreateCompatibleBitmap(hDC, l_uiWidth, l_uiHeight);
HGDIOBJ oldhbmp = SelectObject(hdcMemDC, hbmp);
BITMAPINFO bi;
bi.bmiHeader.biSize = sizeof(BITMAPINFOHEADER);
bi.bmiHeader.biWidth = l_uiWidth;
bi.bmiHeader.biHeight = l_uiHeight;
bi.bmiHeader.biPlanes = 1;
bi.bmiHeader.biBitCount = 8;
bi.bmiHeader.biCompression = BI_RGB;
bi.bmiHeader.biSizeImage = 0;
bi.bmiHeader.biXPelsPerMeter = 0;
bi.bmiHeader.biYPelsPerMeter = 0;
bi.bmiHeader.biClrUsed = 256;
bi.bmiHeader.biClrImportant = 0;
BYTE *l_ImagePDM = new BYTE[l_uiWidth * l_uiHeight];
GetDIBits(hdcMemDC, hbmp, 0, l_uiHeight, l_ImagePDM, &bi, DIB_RGB_COLORS);
Please help me!
What's wrong with my code?
You have not drawn anything on the bitmap that you are querying the bits for. CreateCompatibleBitmap() does not make a copy of the pixels of the source HDC. It merely allocates an HBITMAP that is compatible with the specified HDC, which then allows you to select the HBITMAP into that HDC. But you still have to draw something onto the bitmap to make its content meaningful before you can then query its bits.
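For example, blitting the window contents into the bitmap first (reusing the variables from the question) gives the queried bits something meaningful; a sketch:
// Draw the window's pixels into the selected bitmap before querying its bits
BitBlt(hdcMemDC, 0, 0, l_uiWidth, l_uiHeight, hDC, 0, 0, SRCCOPY);
SelectObject(hdcMemDC, oldhbmp); // deselect hbmp; GetDIBits wants an unselected bitmap
GetDIBits(hdcMemDC, hbmp, 0, l_uiHeight, l_ImagePDM, &bi, DIB_RGB_COLORS);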
When you use C++ Builder, you can use the Graphics unit for processing graphics.
Example of copying something from a window:
void __fastcall TForm1::Button1Click(TObject *Sender)
{
    // Create a bitmap
    Graphics::TBitmap* bit = new Graphics::TBitmap();
    bit->PixelFormat = pf24bit;
    bit->HandleType = bmDIB;
    bit->Width = 200;
    bit->Height = 200; // or bit->SetSize(200, 200) in newer versions of C++ Builder
    // Copy something from this Form (from the window)
    bit->Canvas->CopyRect(TRect(0, 0, 200, 200), this->Canvas, TRect(0, 0, 200, 200));
    // Do something with the bitmap data
    RGBTRIPLE* line;
    for (int i = 0; i < bit->Height; i++)
    {
        // Get the memory address of line i
        line = (RGBTRIPLE*) bit->ScanLine[i];
        // Change the 5th pixel
        line[5].rgbtRed = 255;
    }
    // Get the whole bitmap memory. Bitmap data are stored upside down and the size of each row
    // is rounded up to a multiple of 4 bytes.
    unsigned char* bitmem = (unsigned char*)(bit->ScanLine[bit->Height - 1]);
    //...
    // Draw the bitmap on the Form
    Canvas->Draw(0, 200, bit);
    delete bit;
}
When you have a device context, you can use TCanvas like this:
void __fastcall TForm1::Button2Click(TObject *Sender)
{
    // Create a device context
    HDC hdc = GetDC(this->Handle); // or GetWindowDC for the whole window with frame
    // Create a canvas and associate the device context
    TCanvas* canv = new TCanvas();
    canv->Handle = hdc;
    // Create a bitmap
    Graphics::TBitmap* bit = new Graphics::TBitmap();
    bit->PixelFormat = pf24bit;
    bit->HandleType = bmDIB;
    bit->Width = this->ClientWidth;
    bit->Height = this->ClientHeight; // or bit->SetSize(w, h) in newer versions of C++ Builder
    // Copy the window content
    TRect r(0, 0, this->ClientWidth, this->ClientHeight);
    bit->Canvas->CopyRect(r, canv, r); // USING TCanvas
    // Release the context and delete the canvas
    ReleaseDC(this->Handle, hdc);
    delete canv;
    // Do something with the bitmap
    bit->SaveToFile("screenshot.bmp");
    delete bit;
}
You can also use TCanvas with a device context from BeginPaint:
canv->Handle = BeginPaint(hwnd, &ps);
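For instance, a minimal WM_PAINT handler along those lines (assuming a raw window procedure; sketch only):
case WM_PAINT:
{
    PAINTSTRUCT ps;
    TCanvas* canv = new TCanvas();
    canv->Handle = BeginPaint(hwnd, &ps);
    // ... draw via canv, e.g. canv->TextOut(10, 10, "hello") ...
    canv->Handle = 0; // detach the DC before EndPaint
    EndPaint(hwnd, &ps);
    delete canv;
    break;
}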

How to get a DC from Direct3D?

I need to get a device context (DC) from Direct3D. Here are some code snippets.
1. Create the device:
int windowWidth = 640;
int windowHeight = 480;
IDirect3D9* direct3D9 = Direct3DCreate9(D3D_SDK_VERSION);
if (direct3D9 == NULL)
{
    return FALSE;
}
D3DDISPLAYMODE *d3ddisplayMode = (D3DDISPLAYMODE *)calloc(1, sizeof(D3DDISPLAYMODE));
hr = direct3D9->GetAdapterDisplayMode(D3DADAPTER_DEFAULT, d3ddisplayMode);
if (hr != D3D_OK)
{
    free(d3ddisplayMode);
    direct3D9->Release();
    return FALSE;
}
D3DPRESENT_PARAMETERS *d3dpresentParam = (D3DPRESENT_PARAMETERS *)calloc(1, sizeof(D3DPRESENT_PARAMETERS));
d3dpresentParam->Windowed = TRUE;
d3dpresentParam->hDeviceWindow = NULL;
d3dpresentParam->SwapEffect = D3DSWAPEFFECT_DISCARD;
d3dpresentParam->BackBufferFormat = d3ddisplayMode->Format;
d3dpresentParam->BackBufferWidth = windowWidth;
d3dpresentParam->BackBufferHeight = windowHeight;
d3dpresentParam->BackBufferCount = 1;
free(d3ddisplayMode);
hr = direct3D9->CreateDevice(D3DADAPTER_DEFAULT, D3DDEVTYPE_HAL, NULL, D3DCREATE_SOFTWARE_VERTEXPROCESSING, d3dpresentParam, &direct3D9Device);
2. Create the texture:
hr = D3DXCreateTexture(direct3D9Device,bmpWidth,bmpHeight,1,0,D3DFMT_X8R8G8B8,D3DPOOL_MANAGED,&pTexture);
3. Display the image:
float left = 0, top = 0, width = 640, height = 480;
direct3D9Device->BeginScene();
D3DXMATRIX mat;
D3DXVECTOR3 pos;
pos.x = (bmpWidth * left) / width;
pos.y = (bmpHeight * top) / height;
pos.z = 0;
d3dxSprite->Begin(D3DXSPRITE_ALPHABLEND);
D3DXVECTOR2 scaling((width / bmpWidth), (height / bmpHeight));
if (pTexture == direct3DTextureRemote)
{
    D3DXVECTOR2 spriteCentre((width / 2), (height / 2));
    D3DXMatrixTransformation2D(&mat, NULL, 0.0, &scaling, &spriteCentre, NULL, NULL);
}
else
{
    D3DXMatrixTransformation2D(&mat, NULL, 0.0, &scaling, NULL, NULL, NULL);
}
d3dxSprite->SetTransform(&mat);
d3dxSprite->Draw(pTexture, NULL, NULL, &pos, 0xFFFFFFFF);
d3dxSprite->End();
direct3D9Device->EndScene();
direct3D9Device->Present(NULL, NULL, NULL, NULL);
It is now working properly. I can get a DC from a window with HDC hdc = ::GetDC(hwnd), but in my case there is no window (i.e. it is windowless), so how can I get a DC from DirectX? Please give some piece of code that gets a DC from the DirectX device.
Call GetDC with NULL as the argument:
HDC hdc = ::GetDC(0);
Quote from MSDN:
Parameters
hWnd [in]
A handle to the window whose DC is to be retrieved.
If this value is NULL, GetDC retrieves the DC for the entire screen.
Edit:
Now that we know you are using NPAPI, here is a possible solution:
Call the NPAPI function NPN_GetValue() with the NPNVnetscapeWindow parameter. The returned HWND is a handle to the plug-in drawing surface. Use it when creating the DirectX device and to retrieve the HDC.
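A sketch of that call (instance being your NPP plug-in instance):
HWND hwnd = NULL;
if (NPN_GetValue(instance, NPNVnetscapeWindow, &hwnd) == NPERR_NO_ERROR && hwnd != NULL)
{
    HDC hdc = ::GetDC(hwnd); // DC of the plug-in drawing surface
    // ... also pass hwnd as hDeviceWindow when creating the device ...
    ::ReleaseDC(hwnd, hdc);
}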
Alternatively, you could retrieve the back buffer (IDirect3DSurface9) via the IDirect3DDevice9::GetRenderTarget() method and then retrieve its HDC via the IDirect3DSurface9::GetDC() method.
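A sketch of the back-buffer route, using the device created above:
IDirect3DSurface9 *backBuffer = NULL;
if (SUCCEEDED(direct3D9Device->GetRenderTarget(0, &backBuffer)))
{
    HDC hdc = NULL;
    if (SUCCEEDED(backBuffer->GetDC(&hdc)))
    {
        // ... use hdc ...
        backBuffer->ReleaseDC(hdc); // release before any further rendering
    }
    backBuffer->Release();
}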

OpenCV image unstable while running AES decryption

I am trying to capture video from a webcam using OpenCV and transmit it over TCP. In addition, I want to encrypt the video using AES. But whenever I run the AES decrypt function, the video becomes unstable.
I am using the OpenCV-over-TCP example and an AES example.
Whenever I run this function:
img->imageData = aes_decrypt(&de, img->imageData, &imgsize);
my video gets unstable.
I have attached the code segment where I wrote the function.
/* start receiving images */
while (1)
{
    /* get raw data */
    for (i = 0; i < imgsize; i += bytes) {
        if ((bytes = recv(sock, sockdata + i, imgsize - i, 0)) == -1) {
            quit("recv failed", 1);
        }
    }
    pthread_mutex_lock(&mutex);
    for (i = 0, k = 0; i < img->height; i++) {
        for (j = 0; j < img->width; j++) {
            ((uchar*)(img->imageData + i * img->widthStep))[j] = sockdata[k++];
        }
    }
    img->imageData = aes_decrypt(&de, img->imageData, &imgsize);
    is_data_ready = 1;
    pthread_mutex_unlock(&mutex);
    /* have we terminated yet? */
    pthread_testcancel();
    /* no, take a rest for a while */
    usleep(1000);
}
This is my first post; sorry for my bad English and the formatting.
