I want to get the stream of an IP camera on my iPhone/iPad and display it on screen. From my research it seems ffmpeg is the only way to achieve this, but I found almost nothing on ffmpeg itself. Is there another way to achieve it, or a confirmed way to get a compiled ffmpeg on Mac? If so, please mention it. Material on how to use ffmpeg, or source code examples, would be highly appreciated.
Is there any built-in framework to achieve this? If not, please mention any free framework/SDK that provides this functionality.
Thanks
There are actually a few.
Here are some links:
http://www.streammore.tv/
http://www.live555.com/
I am sure if you google you can find more.
I can only address the first one, because that is ours, but I didn't want this to sound purely like self-promotion.
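As for the built-in option: if the camera (or an intermediate relay server) can serve the stream as HTTP Live Streaming (HLS), you don't necessarily need ffmpeg at all; the built-in AVFoundation framework can display it directly. A minimal sketch, assuming a hypothetical HLS URL exposed by the camera. Note that AVPlayer does not speak raw RTSP, which is what most IP cameras use natively, and that is exactly why people usually reach for ffmpeg or live555:

```swift
import UIKit
import AVFoundation

final class CameraViewController: UIViewController {
    private var player: AVPlayer?

    override func viewDidLoad() {
        super.viewDidLoad()

        // Hypothetical HLS endpoint; replace with your camera's (or relay's) URL.
        guard let url = URL(string: "http://192.168.1.10/stream/index.m3u8") else { return }

        // AVPlayer handles HLS natively; no third-party decoding needed.
        let player = AVPlayer(url: url)
        let layer = AVPlayerLayer(player: player)
        layer.frame = view.bounds
        layer.videoGravity = .resizeAspect
        view.layer.addSublayer(layer)

        player.play()
        self.player = player
    }
}
```

If the camera only speaks RTSP, a small relay (e.g. an ffmpeg process on a server remuxing RTSP to HLS) is a common workaround that keeps the iOS side free of third-party code.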
I currently have a Flutter app, and it needs to be protected from screenshots and screen recording,
but when I searched for this, I found no official way to implement it.
There seem to be some tricks, though (like the 60 fps trick? I know the concept, but I don't know how to implement it).
You can also see a black screen when you record video in Netflix, so they prevent it in some way.
How could I achieve this? Thanks.
There is a package called window manager which does just what you are asking: it restricts external apps from recording and also does not allow screenshots to be taken.
A detailed tutorial is given in this article.
I have the following issue, and I have found a few topics here talking about it, but none of them actually answers my question.
I'm pretty new to iOS development. I searched the Apple documentation but didn't find anything useful.
I need to get the audio sample/buffer/stream from the headphone microphone into a variable (or something similar) that I can manipulate, and then push it back to the headphones, so that I can hear my own voice while I'm talking.
I found some things about AVFoundation, but nothing more.
I know it's possible to do this, but I haven't found out how.
Can anybody help me further?
Since the language you are looking for is Swift, you might find something useful in EZAudio, even though that project was deprecated on June 13, 2016. Check the example called PassThrough in that project: it implements exactly the "hear your own voice while talking" behavior, and it is written in Swift. Hope this helps.
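If you'd rather avoid a deprecated library, the same pass-through can be sketched with plain AVFoundation using AVAudioEngine. This is an untested sketch; on a real device you would also need to activate an AVAudioSession with the .playAndRecord category and add a microphone-usage description to Info.plist:

```swift
import AVFoundation

// Minimal mic-to-headphone pass-through sketch using AVAudioEngine.
let engine = AVAudioEngine()
let input = engine.inputNode

// Connect the microphone input straight to the main mixer,
// which feeds the hardware output (your headphones).
let format = input.outputFormat(forBus: 0)
engine.connect(input, to: engine.mainMixerNode, format: format)

// Optionally tap the raw buffers while they pass through,
// which gives you the "manipulable variable" from the question.
input.installTap(onBus: 0, bufferSize: 1024, format: format) { buffer, _ in
    // `buffer` is an AVAudioPCMBuffer you can inspect or process here.
}

do {
    try engine.start()
} catch {
    print("Failed to start audio engine: \(error)")
}
```

Be aware that routing the mic to the speaker (rather than headphones) will cause feedback, so test with headphones plugged in.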
As the title says, I'm simply wondering if it is possible to use the GPS tracking and POI part of junaio and, at the same time, use the scan functionality to scan and recognize images. I'm working with a group on a project which demands that we use both functionalities, and we are currently stuck trying to send two XML documents, which causes the server to return nothing at all. I simply want to know if it is possible to use both functionalities in the same channel, and I would greatly appreciate it if someone could point me in a direction that could help solve our problems, since I've been able to find absolutely nothing on my own. Thanks beforehand!
Scan + GPS/compass is not possible at the moment.
However, it's possible to use GPS/compass tracking and continuous visual search at the same time. This might be the closest thing to your requirements.
You might find more information at http://helpdesk.metaio.com
I want to make an iPad app that analyzes data traffic using tcpdump.
The app should be some kind of implementation/adaptation/wrapper of/for the tcpdump command.
I skimmed through http://www.tcpdump.org/, but I want to save time, so I'd like to ask for some guidelines to solve this:
Is there any Objective-C wrapper for the libpcap library?
Or any other API that handles the tcpdump command on iOS?
How do I use a C/C++ library in an iPhone/iPad app?
Thanks in advance.
I didn't find an Obj-C wrapper for libpcap either. That's not surprising, considering some points raised in this answer. Guy Harris points out the problem specifically: unless you're running on a jailbroken device, you're going to lack the permissions to read the raw packet data.
This question (specifically this answer) suggests that you just name the files properly (e.g. give files that mix Objective-C and C++ a .mm extension), then compile and link them as usual.
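For completeness: once libpcap is built and linked into the project, its C API is callable from Swift (or Objective-C) like any other C library. A hypothetical sketch, assuming `pcap.h` is exposed through a bridging header, and keeping in mind the permission problem above:

```swift
import Foundation

// Assumes pcap.h is exposed via a bridging header and libpcap is linked in.
// On a non-jailbroken iOS device this will fail for lack of permissions.
var errbuf = [CChar](repeating: 0, count: Int(PCAP_ERRBUF_SIZE))
guard let handle = pcap_open_live("en0", 65535, 1, 1000, &errbuf) else {
    fatalError("pcap_open_live failed: \(String(cString: errbuf))")
}

// Grab a single packet and report its captured length.
var header = pcap_pkthdr()
if pcap_next(handle, &header) != nil {
    print("captured \(header.caplen) bytes")
}
pcap_close(handle)
```

The interface name "en0" and the snapshot length are placeholders; on macOS this compiles and runs with sudo, which makes it a reasonable way to prototype before fighting the iOS sandbox.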
Good luck.
I've just created an iOS app that contains lots of media.
Now I want to know whether it's possible to get an overview of all the individual files, as a report or something like that.
(e.g. Unity3D has this kind of feature: you can see all textures, shaders, and so on. Maybe Apple already includes something...)
Thanks for any tips on that.
Slender (App Store link) might do what you are looking for.