Save BIO into char* (from SMIME_write_CMS) - iOS

I want to save (pipe/copy) a BIO into a char array.
When I know the size it works, but otherwise it doesn't.
For example, I can store the content of my char* into a BIO using this:
const unsigned char* data = ...
myBio = BIO_new_mem_buf((void *)data, strlen((const char *)data));
But when I try to use SMIME_write_CMS, which takes a BIO (the one I created before) for its output, it doesn't work.
const int SIZE = 50000;
unsigned char *temp = malloc(SIZE);
memset(temp, 0, SIZE);
out = BIO_new_mem_buf((void *)temp, SIZE);
if (!out) {
    NSLog(@"Couldn't create new file!");
    assert(false);
}
int finished = SMIME_write_CMS(out, cms, in, flags);
if (!finished) {
    NSLog(@"SMIME write CMS didn't succeed!");
    assert(false);
}
printf("cms encrypted: %s\n", temp);
NSLog(@"All succeeded!");
The OpenSSL reference uses direct file output with the BIO.
This works, but I can't use BIO_new_file() in Objective-C... :-/
out = BIO_new_file("smencr.txt", "w");
if (!out)
    goto err;
/* Write out S/MIME message */
if (!SMIME_write_CMS(out, cms, in, flags))
    goto err;
Do you guys have any suggestion?

I would suggest trying SIZE-1; that way you are guaranteed that the buffer is NUL-terminated. Otherwise, it is possible that it is just overrunning the buffer.
out = BIO_new_mem_buf((void*)temp, SIZE-1);
Let me know if that helps.
Edit:
A BIO created with BIO_new_mem_buf() is a read-only buffer, so you cannot write to it. If you want to write to memory, use:
BIO *bio = BIO_new(BIO_s_mem());
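You can then pull the written message back out of the memory BIO with BIO_get_mem_data(). A minimal sketch (error handling elided; cms, in, and flags as in the question):
BIO *out = BIO_new(BIO_s_mem());
if (!out) {
    /* handle allocation failure */
}
if (!SMIME_write_CMS(out, cms, in, flags)) {
    /* handle write failure */
}
/* Borrow a pointer into the BIO's internal buffer */
char *data = NULL;
long len = BIO_get_mem_data(out, &data);
/* The buffer is not NUL-terminated, so copy it if you need a C string */
char *copy = (char *)malloc(len + 1);
memcpy(copy, data, len);
copy[len] = '\0';
printf("cms encrypted: %s\n", copy);
free(copy);
BIO_free(out);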

Related

Trying to write to file using g_file_set_contents

I just want to use GNOME GLib functions to simply write and read a file. I think my syntax for calling the functions is wrong. I tried to open a file with g_fopen("filenam.txt", "w"); but it didn't create any file. I also used g_file_set_contents, and I am trying to save my GString s into a file file.txt with code as follows:
static void events_handler(const uint8_t *pdu, uint16_t len, gpointer user_data)
{
    uint8_t *opdu;
    uint16_t handle, i, olen;
    size_t plen;
    //GString *s;
    const gchar *s;
    gssize length;

    length = 100;
    handle = get_le16(&pdu[1]);

    switch (pdu[0]) {
    case ATT_OP_HANDLE_NOTIFY:
        s = g_string_new(NULL);
        //g_string_printf(s, "Movement data = 0x%04x value: ", handle);
        g_file_set_contents("file.txt", s, 100, NULL);
        break;
    case ATT_OP_HANDLE_IND:
        s = g_string_new(NULL);
        g_string_printf(s, "Indication handle = 0x%04x value: ", handle);
        break;
    default:
        error("Invalid opcode\n");
        return;
    }

    for (i = 3; i < len; i++)
        g_string_append_printf(s, "%02x ", pdu[i]);
    rl_printf("%s\n", s->str);
    g_string_free(s, TRUE);

    if (pdu[0] == ATT_OP_HANDLE_NOTIFY)
        return;

    opdu = g_attrib_get_buffer(attrib, &plen);
    olen = enc_confirmation(opdu, plen);
    if (olen > 0)
        g_attrib_send(attrib, 0, opdu, olen, NULL, NULL, NULL);
}
You're conflating GString* and gchar*. The g_string_*() functions expect a GString*, and g_file_set_contents() expects gchar*. If you want the raw data, use the str field.
Also, I suggest turning on some more warnings on your compiler, since it really should be complaining during development if you try to do this. Passing -Wall should do the trick.
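To illustrate, here is a minimal sketch of the notify case done with a proper GString; save_handle_value is a hypothetical helper, and the pdu/handle parameters mirror the question's variables:
#include <glib.h>

static gboolean save_handle_value(guint16 handle, const guint8 *pdu, guint16 len)
{
    GString *s = g_string_new(NULL);   /* a GString*, not a gchar* */
    GError *err = NULL;
    guint16 i;

    g_string_printf(s, "Movement data = 0x%04x value: ", handle);
    for (i = 3; i < len; i++)
        g_string_append_printf(s, "%02x ", pdu[i]);

    /* pass the raw character data (str) and its real length (len) */
    gboolean ok = g_file_set_contents("file.txt", s->str, (gssize) s->len, &err);
    if (!ok) {
        g_printerr("write failed: %s\n", err->message);
        g_error_free(err);
    }
    g_string_free(s, TRUE);
    return ok;
}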

Play a Video from MemoryStream, Using FFMpeg

I'm having a hard time searching for how to play a video file from a TMemoryStream (or a similar in-memory buffer) using FFmpeg. I've seen many things, including UltraStarDX, expensive FFmpeg components for Delphi, and so on.
One component called FFMpeg Vcl Player claims to play video formats from a memory stream. I downloaded the trial version and I guess it uses CircularBuffer.pas for that purpose (maybe).
Does anyone know how to do this?
Edit:
Now the better question is how to play an encrypted video file, using FFMpeg or similar libraries.
To play video from a memory stream, you can use a custom AVIOContext.
static const int kBufferSize = 4 * 1024;

class my_iocontext_private
{
private:
    my_iocontext_private(my_iocontext_private const &);
    my_iocontext_private& operator = (my_iocontext_private const &);

public:
    my_iocontext_private(IInputStreamPtr inputStream)
        : inputStream_(inputStream)
        , buffer_size_(kBufferSize)
        , buffer_(static_cast<unsigned char*>(::av_malloc(buffer_size_))) {
        ctx_ = ::avio_alloc_context(buffer_, buffer_size_, 0, this,
            &my_iocontext_private::read, NULL, &my_iocontext_private::seek);
    }

    ~my_iocontext_private() {
        ::av_free(ctx_);
        ::av_free(buffer_);
    }

    void reset_inner_context() { ctx_ = NULL; buffer_ = NULL; }

    static int read(void *opaque, unsigned char *buf, int buf_size) {
        my_iocontext_private* h = static_cast<my_iocontext_private*>(opaque);
        return h->inputStream_->Read(buf, buf_size);
    }

    static int64_t seek(void *opaque, int64_t offset, int whence) {
        my_iocontext_private* h = static_cast<my_iocontext_private*>(opaque);
        if (0x10000 == whence) /* AVSEEK_SIZE: report total stream size */
            return h->inputStream_->Size();
        return h->inputStream_->Seek(offset, whence);
    }

    ::AVIOContext *get_avio() { return ctx_; }

private:
    IInputStreamPtr inputStream_; // abstract stream interface; you can adapt it to TMemoryStream
    int buffer_size_;
    unsigned char * buffer_;
    ::AVIOContext * ctx_;
};

//// ..........
/// prepare input stream:
IInputStreamPtr inputStream = MyCustomCreateInputStreamFromMemory();

my_iocontext_private priv_ctx(inputStream);
AVFormatContext *ctx = ::avformat_alloc_context();
ctx->pb = priv_ctx.get_avio();

int err = avformat_open_input(&ctx, "arbitrarytext", NULL, NULL);
if (err < 0)
    return -1;

//// normal usage of ctx:
//// avformat_find_stream_info(ctx, NULL);
//// av_read_frame(ctx, &pkt);
//// etc..
You can waste your time rewriting FFMPEG from C++ to Delphi, or mess with wrapper libraries.
Or if you're just interested in playing a video in Delphi, then check out Mitov's VideoLab components.
http://www.mitov.com/products/videolab#components
If you want to play a stream from memory, you can create a virtual drive. I suggest BoxedApp SDK.
It helps you make a virtual drive with virtual files that you can write to, and then hand the virtual path to the player component you have.
BoxedApp is not free, but it is really awesome and very simple to use!

Gstreamer - appsrc push model

I'm developing a C application (under Linux) that receives a raw h.264 stream and should visualize this stream using the GStreamer APIs.
I am a newbie with GStreamer, so maybe I am making huge stupid mistakes or ignoring well-known stuff; sorry about that.
I got a raw h.264 video (I knew it was in the exact format I need) and developed an application that plays it. It works correctly with appsrc in pull mode (when need-data is called, I get new data from the file and perform a push-buffer).
Now I'm trying to do the exact same thing but in push mode, basically because I don't have a video file, but a stream. So I have a method in my code that is called every time new data (in the form of a uint8_t buffer) arrives, and this is my video source.
I googled my problem and had a look at the documentation, but I found no simple code snippets for my use case, even though it seems a very simple one. I understood that I have to init the pipeline and appsrc and then only push-buffer when I have new data.
Well, I developed two methods: init_stream() for pipeline/appsrc initialization and populateApp(void *inBuf, size_t len) to send data when it is available.
It compiles and runs correctly, but shows no video:
struct _App
{
    GstAppSrc *appsrc;
    GstPipeline *pipeline;
    GstElement *h264parse;
    GstElement *mfw_vpudecoder;
    GstElement *mfw_v4lsin;
    GMainLoop *loop;
};
typedef struct _App App;

App s_app;
App *app = &s_app;

static gboolean bus_message (GstBus * bus, GstMessage * message, App * app)
{
    GST_DEBUG ("got message %s", gst_message_type_get_name (GST_MESSAGE_TYPE (message)));

    switch (GST_MESSAGE_TYPE (message)) {
    case GST_MESSAGE_ERROR:
        g_error ("received error");
        g_main_loop_quit (app->loop);
        break;
    case GST_MESSAGE_EOS:
        g_main_loop_quit (app->loop);
        break;
    default:
        break;
    }
    return TRUE;
}
int init_stream()
{
    GstBus *bus;

    gst_init (NULL, NULL);
    fprintf(stderr, "gst_init done\n");

    /* create a mainloop to get messages */
    app->loop = g_main_loop_new (NULL, TRUE);
    fprintf(stderr, "app loop initialized\n");

    app->pipeline = gst_parse_launch("appsrc name=mysource ! h264parse ! mfw_vpudecoder ! mfw_v4lsin", NULL);
    app->appsrc = gst_bin_get_by_name (GST_BIN(app->pipeline), "mysource");
    gst_app_src_set_stream_type(app->appsrc, GST_APP_STREAM_TYPE_STREAM);
    gst_app_src_set_emit_signals(app->appsrc, TRUE);
    fprintf(stderr, "Pipeline and appsrc initialized\n");

    /* Create Bus from pipeline */
    bus = gst_pipeline_get_bus(app->pipeline);
    fprintf(stderr, "bus created\n");

    /* add watch for messages */
    gst_bus_add_watch (bus, (GstBusFunc) bus_message, app);
    gst_object_unref(bus);
    fprintf(stderr, "bus_add_watch done\n");

    GstCaps* video_caps = gst_caps_new_simple ("video/x-h264",
        "width", G_TYPE_INT, 800,
        "height", G_TYPE_INT, 480,
        "framerate", GST_TYPE_FRACTION, 25, 1,
        NULL);
    gst_app_src_set_caps(GST_APP_SRC(app->appsrc), video_caps);

    /* go to playing and wait in a mainloop. */
    gst_element_set_state ((GstElement*) app->pipeline, GST_STATE_PLAYING);
    fprintf(stderr, "gst_element_set_state play\n");

    /* this mainloop is stopped when we receive an error or EOS */
    g_main_loop_run (app->loop);
    fprintf(stderr, "g_main_loop_run called\n");

    gst_element_set_state ((GstElement*) app->pipeline, GST_STATE_NULL);
    fprintf(stderr, "gst_element_set_state GST_STATE_NULL\n");

    /* free the file */
    // g_mapped_file_unref (app->file);

    gst_object_unref (bus);
    g_main_loop_unref (app->loop);
    return 0;
}
void populateApp(void *inBuf, size_t len) {
    guint8 *_buffer = (guint8*) inBuf;
    GstFlowReturn ret;
    GstBuffer *buffer = gst_buffer_new();

    GstCaps* video_caps = gst_caps_new_simple ("video/x-h264",
        "width", G_TYPE_INT, 800,
        "height", G_TYPE_INT, 480,
        "framerate", GST_TYPE_FRACTION, 25, 1,
        NULL);
    gst_buffer_set_caps(buffer, video_caps);

    GST_BUFFER_DATA (buffer) = _buffer;
    GST_BUFFER_SIZE (buffer) = len;

    // g_signal_emit_by_name (app->appsrc, "push-buffer", buffer, &ret);
    ret = gst_app_src_push_buffer(GST_APP_SRC(app->appsrc), buffer);
    gst_buffer_unref (buffer);
}
As I said, I am a total newbie with GStreamer, so there's a lot of cut-and-paste code from the internet, but IMHO it should work.
Do you see any issues?
It's not clear how you are calling populateApp, but you need to call it repeatedly, as you have data to push to your pipeline. This can be done in a separate thread from the one blocked by g_main_loop_run, or you can restructure your program to avoid using GMainLoop.
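A minimal sketch of the threaded approach, assuming a hypothetical blocking read_stream_data() source; real code would also wait until init_stream() has created the appsrc before pushing, and would copy each chunk rather than wrap the caller's buffer:
#include <stdint.h>
#include <sys/types.h>
#include <gst/app/gstappsrc.h>

/* Hypothetical blocking source that delivers raw h.264 bytes. */
extern ssize_t read_stream_data(uint8_t *buf, size_t len);

static gpointer pipeline_thread(gpointer data)
{
    init_stream();   /* blocks inside g_main_loop_run() */
    return NULL;
}

void feed_pipeline(void)
{
    GThread *t = g_thread_new("gst-loop", pipeline_thread, NULL);
    uint8_t buf[4096];
    ssize_t n;

    while ((n = read_stream_data(buf, sizeof(buf))) > 0)
        populateApp(buf, (size_t) n);   /* push each chunk to appsrc */

    gst_app_src_end_of_stream(GST_APP_SRC(app->appsrc));  /* signal EOS */
    g_thread_join(t);
}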

tmp directory path inside 'C' library

My Cocoa application uses a library written in C which tries to write a file at the '/tmp' path. This causes sandbox violations. In Cocoa we can use the NSTemporaryDirectory API. To fix the sandbox violation, is it safe to use the tmpfile API in a sandboxed environment? Are there any other solutions?
EDITED after actually testing it:
No, tmpnam() won't work, and I think the only way to get a temporary filename is to provide a .m file with your library specifically for use with iOS and OS X, which can return the temporary directory as a C string:
apple.h:
#pragma once
#include <stddef.h>

extern size_t getTemporaryDirectory(char *buffer, size_t len);
apple.m:
#import <Foundation/Foundation.h>
#include <string.h>
#include "apple.h"

size_t getTemporaryDirectory(char *buffer, size_t len)
{
    @autoreleasepool
    {
        NSString *tempDir = NSTemporaryDirectory();
        if (tempDir != nil)
        {
            const char *utf = [tempDir UTF8String];
            /* note: strncpy does not NUL-terminate on truncation,
               so pass a buffer large enough for the whole path */
            strncpy(buffer, utf, len);
            return strlen(utf);
        }
    }
    return 0;
}
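On the C side you can then build a unique path inside that directory, for example with mkstemp(). A sketch (open_temp_file is a hypothetical helper; NSTemporaryDirectory() returns a path with a trailing slash):
#include <stdio.h>
#include <stdlib.h>
#include <unistd.h>
#include "apple.h"

int open_temp_file(void)
{
    char dir[1024];
    if (getTemporaryDirectory(dir, sizeof(dir)) == 0)
        return -1;   /* no sandbox-safe directory available */

    char path[1200];
    snprintf(path, sizeof(path), "%smylib-XXXXXX", dir);
    return mkstemp(path);   /* fills in XXXXXX and opens the file */
}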

DirectX11 CreateWICTextureFromMemory Using PNG

I've currently got textures loading using CreateWICTextureFromFile; however, I'd like a little more control over it, and I'd like to store images in their byte form in a resource loader. Below are two sets of test code that return two different results; I'm looking for any insight into a possible solution.
ID3D11ShaderResourceView* srv;
std::basic_ifstream<unsigned char> file("image.png", std::ios::binary);
file.seekg(0,std::ios::end);
int length = file.tellg();
file.seekg(0,std::ios::beg);
unsigned char* buffer = new unsigned char[length];
file.read(&buffer[0],length);
file.close();
HRESULT hr;
hr = DirectX::CreateWICTextureFromMemory(_D3D->GetDevice(), _D3D->GetDeviceContext(), &buffer[0], sizeof(buffer), nullptr, &srv, NULL);
For the above code I get "Component not found" in return.
std::ifstream file;
ID3D11ShaderResourceView* srv;
file.open("../Assets/Textures/osg.png", std::ios::binary);
file.seekg(0,std::ios::end);
int length = file.tellg();
file.seekg(0,std::ios::beg);
std::vector<char> buffer(length);
file.read(&buffer[0],length);
file.close();
HRESULT hr;
hr = DirectX::CreateWICTextureFromMemory(_D3D->GetDevice(), _D3D->GetDeviceContext(), (const uint8_t*)&buffer[0], sizeof(buffer), nullptr, &srv, NULL);
The above code returns that the image format is unknown.
I'm clearly doing something wrong here; any help is greatly appreciated. I tried finding anything even similar on Stack Overflow and Google, to no avail.
Hopefully someone trying to do the same thing will find this solution.
Below is the code I used to solve this problem.
std::basic_ifstream<unsigned char> file("image.png", std::ios::binary);
if (file.is_open())
{
    file.seekg(0, std::ios::end);
    int length = file.tellg();
    file.seekg(0, std::ios::beg);
    unsigned char* buffer = new unsigned char[length];
    file.read(&buffer[0], length);
    file.close();
    HRESULT hr;
    hr = DirectX::CreateWICTextureFromMemory(_D3D->GetDevice(), _D3D->GetDeviceContext(), &buffer[0], (size_t)length, nullptr, &srv, NULL);
}
The important change is passing (size_t)length to CreateWICTextureFromMemory. In both earlier attempts, sizeof(buffer) evaluated to the size of the pointer (or of the std::vector object itself), not the number of bytes read from the file, so WIC never saw a complete image and couldn't identify the format.
It was indeed a stupid error.
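For the std::vector variant, the equivalent fix is to pass the element count rather than sizeof(buffer), e.g. (a sketch, with _D3D and srv as in the question):
hr = DirectX::CreateWICTextureFromMemory(
    _D3D->GetDevice(), _D3D->GetDeviceContext(),
    reinterpret_cast<const uint8_t*>(buffer.data()),
    buffer.size(),   // the actual byte count, not sizeof(buffer)
    nullptr, &srv);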
