I am developing an iOS app using pjsip-2.6 against an IPv4 SIP server. First I built pjsip with the following line in config_site.h:
#define PJ_HAS_IPV6 1
The build succeeded. Then I added the libraries to my project and ran the application on an IPv4 network. It registered successfully and voice calls work great.
Then I switched to Apple's NAT64 network and nothing works. Here are my code snippets.
For creating the UDP transport on IPv4 I used the following code:
pjsua_transport_config cfg;
pjsua_transport_config_default(&cfg);
cfg.port = 5060;
// Add UDP transport.
status = pjsua_transport_create(PJSIP_TRANSPORT_UDP, &cfg, &transport_id);
if (status != PJ_SUCCESS) error_exit("Error creating transport", status);
For creating the transport on IPv6 I used the following code:
pjsua_transport_config cfg;
pjsua_transport_config_default(&cfg);
cfg.port = 5070;
// Add UDP transport for ipv6
status = pjsua_transport_create(PJSIP_TRANSPORT_UDP6, &cfg, &transport_id_udp6);
if (status != PJ_SUCCESS) error_exit("Error creating transport", status);
For creating the account on the IPv6 network I added:
acc_cfg.cred_info[0].username = pj_str((char*)uname);
acc_cfg.cred_info[0].data_type = PJSIP_CRED_DATA_PLAIN_PASSWD;
acc_cfg.cred_info[0].data = pj_str((char *)passwd);
acc_cfg.cred_info[0].realm = pj_str("*");
acc_cfg.cred_info[0].scheme = pj_str((char*)"Digest");
char regUri[PJSIP_MAX_URL_SIZE];
sprintf(regUri, "sip:%s", sip_server);
acc_cfg.reg_uri = pj_str(regUri);
acc_cfg.ipv6_media_use = PJSUA_IPV6_ENABLED;
acc_cfg.transport_id = transport_id_udp6;
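The account is then added roughly like this (a condensed sketch of my code; it assumes acc_cfg came from pjsua_acc_config_default() and that uname and sip_server hold the user and server strings):
// Sketch: register the account config built above.
char idUri[PJSIP_MAX_URL_SIZE];
snprintf(idUri, sizeof(idUri), "sip:%s@%s", uname, sip_server);
acc_cfg.id = pj_str(idUri);
acc_cfg.cred_count = 1;

pjsua_acc_id acc_id;
status = pjsua_acc_add(&acc_cfg, PJ_TRUE, &acc_id);
if (status != PJ_SUCCESS) error_exit("Error adding account", status);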
It would be great if someone could point out the problem. Any help would be appreciated.
I think you are failing to create the transport on the IPv6 network.
There is a patch available for this (link: https://github.com/johanlantz/pj-nat64).
You need to integrate that patch for the NAT64 issue; below are the steps to follow.
1) Download the pj-nat64 source from the link above.
2) Unzip the file and copy pj-nat64.c into the pjproject (pjsip) source at pjsip/src/pjsua-lib.
3) Copy pj-nat64.h into the pjproject source at pjsip/include/pjsua-lib.
4) Add pj-nat64.o to the pjsip makefile (path: pjsip/build).
5) Open the makefile, search for the string "Defines for building PJSUA-LIB library", and add pj-nat64.o after pjsua_vid.o.
6) Compile the pjsip source for all architectures and take the library and header files.
7) After pjsua_start() returns successfully, include the lines below.
#if defined(PJ_HAS_IPV6) && PJ_HAS_IPV6 == 1
pj_nat64_enable_rewrite_module();
#endif
8) Add the code below in the on_reg_state2() callback, for calls.
the_transport = rp->rdata->tp_info.transport;
NSLog(@"transport called %s", the_transport->factory->type_name);
if (the_transport->factory->type & PJSIP_TRANSPORT_IPV6)
{
    ipv4 = FALSE;
    NSLog(@"registered over IPv6, enabling NAT64 rewriting");
    pjsua_var.acc[0].cfg.ipv6_media_use = PJSUA_IPV6_ENABLED;
    nat64_options option = NAT64_REWRITE_INCOMING_SDP | NAT64_REWRITE_ROUTE_AND_CONTACT;
    pj_nat64_set_options(option);
}
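Beyond the patch, here is a rough sketch (my own illustration, not part of pj-nat64) of deciding which transport to bind the account to, based on how the SIP server resolves on the current network; on a NAT64/DNS64 network an IPv4-only server typically resolves to a synthesized IPv6 address.
// Sketch only: bind the account to the IPv6 transport when the server
// resolves to IPv6 on the current (NAT64/DNS64) network.
pj_addrinfo ai;
unsigned cnt = 1;
pj_str_t host = pj_str((char *)sip_server);

if (pj_getaddrinfo(pj_AF_UNSPEC(), &host, &cnt, &ai) == PJ_SUCCESS &&
    ai.ai_addr.addr.sa_family == pj_AF_INET6())
{
    acc_cfg.transport_id = transport_id_udp6;
    acc_cfg.ipv6_media_use = PJSUA_IPV6_ENABLED;
}
else
{
    acc_cfg.transport_id = transport_id;  // the IPv4 UDP transport
}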
I use Reachability to check the internet connection. When I connect to Wi-Fi it works fine, but when connected to an LTE network it gives me a "No Connection" error, even though it is connected and I can browse the internet perfectly over LTE. I have also enabled App Transport Security in Info.plist. I'm testing on iOS 10.1.1; do I have to do something to get internet access for my app on iOS 10 using LTE?
Guru provided an excellent link in his comment. An updated way to check an LTE connection is indeed through CoreTelephony. Seeing as you asked in Swift though, I'll provide a Swift answer.
Make sure you have CoreTelephony.framework added under your project's Build Phases > Link Binary With Libraries
The code to check the connection in Swift would be as follows:
import CoreTelephony // Make sure to import CoreTelephony

func checkConnection() {
    let telephonyInfo = CTTelephonyNetworkInfo()
    let currentConnection = telephonyInfo.currentRadioAccessTechnology
    // Just a print statement to output the current connection information
    print("Current Connection: \(String(describing: currentConnection))")
    if currentConnection == CTRadioAccessTechnologyLTE { // Connected to LTE
    } else if currentConnection == CTRadioAccessTechnologyEdge { // Connected to EDGE
    } else if currentConnection == CTRadioAccessTechnologyWCDMA { // Connected to 3G
    }
}
In my iOS application I am using the libssh2 library. I am trying to SSH to an IPv6 address, but the socket is not being created and I get a nil socket. It works fine with an IPv4 address.
static CFSocketRef _CreateSocketConnectedToHost(NSString* name, UInt16 port, CFOptionFlags callBackTypes, CFSocketCallBack callback, const CFSocketContext* context, CFTimeInterval timeOut)
I have searched for this but have not found anything about IPv6 support with libssh2.
Please help me: does libssh2 not support IPv6? Can we make it work using libssh2?
You just need to use the IPv6 address family for the socket instead of IPv4, and you will be able to create the socket with an IPv6 address.
struct sockaddr_in6 ipAddress6;
memset(&ipAddress6, 0, sizeof(ipAddress6));
ipAddress6.sin6_len = sizeof(ipAddress6);
ipAddress6.sin6_family = AF_INET6;
ipAddress6.sin6_port = htons(port);
// Fill ipAddress6.sin6_addr with the target address, e.g. via inet_pton(AF_INET6, ...).
CFSocketSignature signature;
signature.socketType = SOCK_STREAM;
signature.protocolFamily = AF_INET6;
signature.protocol = IPPROTO_IP; // not IPPROTO_IPV6
signature.address = (CFDataRef)[NSData dataWithBytes:&ipAddress6 length:sizeof(ipAddress6)];
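With the signature filled in, the socket can then be created along these lines (a sketch; callBackTypes, callback, context and timeOut are the parameters your existing _CreateSocketConnectedToHost() already receives):
// Sketch: create the connected socket from the IPv6 signature built above.
CFSocketRef sock = CFSocketCreateConnectedToSocketSignature(kCFAllocatorDefault,
                                                            &signature,
                                                            callBackTypes,
                                                            callback,
                                                            context,
                                                            timeOut);
if (sock == NULL) {
    NSLog(@"Failed to create IPv6 socket");
}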
I am working on an iOS application in which I have to send SMS internally, without user involvement. After googling I found an answer and I am using this code.
xpc_connection_t myconnection;
dispatch_queue_t queue = dispatch_queue_create("com.apple.chatkit.clientcomposeserver.xpc", DISPATCH_QUEUE_CONCURRENT);
myconnection = xpc_connection_create_mach_service("com.apple.chatkit.clientcomposeserver.xpc", queue, XPC_CONNECTION_MACH_SERVICE_PRIVILEGED);
Now we have the XPC connection myconnection to the SMS sending service. However, XPC creates connections in a suspended state, so we need to take one more step to activate it.
xpc_connection_set_event_handler(myconnection, ^(xpc_object_t event){
    xpc_type_t xtype = xpc_get_type(event);
    if (XPC_TYPE_ERROR == xtype)
    {
        NSLog(@"XPC sandbox connection error: %s\n", xpc_dictionary_get_string(event, XPC_ERROR_KEY_DESCRIPTION));
    }
    // Always set an event handler. More on this later.
    NSLog(@"Received a message event!");
});
xpc_connection_resume(myconnection);
The connection is activated. Right at this moment iOS 6 will display a message in the phone log saying that this type of communication is forbidden. Now we need to generate a dictionary, similar to xpc_dictionary, with the data required for sending the message.
NSArray *recipients = [NSArray arrayWithObjects:@"+7 (90*) 000-00-00", nil];
NSData *ser_rec = [NSPropertyListSerialization dataWithPropertyList:recipients format:NSPropertyListBinaryFormat_v1_0 options:0 error:NULL];
xpc_object_t mydict = xpc_dictionary_create(0, 0, 0);
xpc_dictionary_set_int64(mydict, "message-type", 0);
xpc_dictionary_set_data(mydict, "recipients", [ser_rec bytes], [ser_rec length]);
xpc_dictionary_set_string(mydict, "text", "hello from your application!");
Little is left: send the message to the XPC port and make sure it is delivered.
xpc_connection_send_message(myconnection, mydict);
xpc_connection_send_barrier(myconnection, ^{
NSLog(@"Message has been successfully delivered");
});
But to use this code I have to include the xpc.h header, and xpc.h is not found. So please suggest what I actually need to do.
You're using XPC in iOS, and the headers aren't going to be there for you by default. I got the /usr/include from https://github.com/realthunder/mac-headers -- it includes the xpc.h and related headers. I took the /xpc subdirectory from that and added only it to my project, because adding the full /usr/include interfered with my existing /usr/include and caused an architecture error on compile.
Even after adding the /xpc subdirectory and including xpc.h, I was unable to get the include paths to work due to nested include problems. I wasted lots of time trying to find the proper solution, then used the kludgy solution of editing xpc.h and base.h so that the nested xpc includes don't use any path. So #include <xpc/base.h> was edited to #include "base.h", etc.
That worked; the app builds and runs after that.
The library path is wrong; use this:
#include "/usr/include/xpc/xpc.h"
GStreamer 1.0 for iOS is delivered as a static framework; the source to build the framework is around 1.2 GB. The framework is huge and tries to provide for any decoding scenario you may have. Trouble is, it tries to do too much, and IMHO not enough thought was put into the iOS port.
Here's the problem: we have an application that uses the GStreamer avdec_h264 plugin for displaying an RTP-over-UDP stream, and this works rather well. Recently we were required to add some special recording functions, so we introduced an API that has its own version of ffmpeg. GStreamer has libav compiled into the framework. When we place our API into the application with gst_IOS_RESTRICTED_PLUGINS disabled, the code runs fine; when we introduce GStreamer.framework into the application, code similar to that shown below fails with a "protocol not found" error.
The problem is that the internal version of libav seems to disable all the protocols that ffmpeg supplies, because GStreamer uses its own custom AVIO callback based on the ffmpeg pipe protocol.
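For reference, the custom-AVIO approach would look roughly like this in plain ffmpeg terms (just a sketch; read_packet_cb and my_source are placeholder names, not GStreamer's actual callback):
// Sketch of feeding avformat from a custom read callback instead of
// relying on libav's built-in network protocols.
static int read_packet_cb(void *opaque, uint8_t *buf, int buf_size)
{
    // Copy up to buf_size bytes of incoming stream data into buf and return
    // the number of bytes written, or AVERROR_EOF when the stream ends.
    return AVERROR_EOF; // placeholder
}

static AVFormatContext *open_with_custom_io(void *my_source)
{
    const int bufSize = 4096;
    unsigned char *ioBuf = av_malloc(bufSize);
    AVIOContext *ioCtx = avio_alloc_context(ioBuf, bufSize, 0 /* read-only */,
                                            my_source, read_packet_cb, NULL, NULL);

    AVFormatContext *ctx = avformat_alloc_context();
    ctx->pb = ioCtx;
    ctx->flags |= AVFMT_FLAG_CUSTOM_IO;

    if (avformat_open_input(&ctx, NULL, NULL, NULL) < 0)
        return NULL;
    return ctx;
}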
According to GStreamer support, which has been somewhat helpful:
1) Add a new recipe with the libav version you want to use and disable the build of the internal libav in gst-libav-1.0 with:
configure_options = '--with-system-libav'
You might need to comment out this part to prevent libav from being packaged in the framework, or make sure that your libav recipe creates these files in the correct place to include them in the framework:
for f in ['libavcodec', 'libavformat', 'libavutil', 'libswscale']:
    for ext in ['.a', '.la']:
        path = os.path.join('lib', f + ext)
        self.files_plugins_codecs_restricted_devel.append(path)
2) Update the libav submodule of gst-libav to use the correct version you need.
https://bugs.freedesktop.org/show_bug.cgi?id=77399
The first method didn't work; the recipe kept getting overwritten even after applying a patch for a bug fix that was made as a result of this bug report.
And I have no idea how to do the second method, which is what I'd like some help with.
Has anyone with GStreamer 1.0 for iOS:
1) Built the gst-libav plugin against a set of ffmpeg static libs (.a) external to the framework,
2) Built the internal libav to allow the RTP, UDP and TCP protocols, or written a custom AVIO callback using the FFPipe protocol,
3) Or just managed to somehow get the code below working with GStreamer?
I don't ask many questions; I've implemented all kinds of encoders/decoders using ffmpeg, live555 and a few hardware decoders. But this GStreamer issue is causing me more sleepless nights than I've had in a long time.
AVFormatContext *avctx;
avctx = avformat_alloc_context();
av_register_all();
avformat_network_init();
avcodec_register_all();
avdevice_register_all();

// Set the RTSP options
AVDictionary *opts = NULL;
av_dict_set(&opts, "rtsp_transport", "udp", 0);

int err = 0;
err = avformat_open_input(&avctx, "rtsp://184.72.239.149/vod/mp4:BigBuckBunny_115k.mov", NULL, &opts);
av_dict_free(&opts);
if (err) {
    NSLog(@"Error: Could not open stream: %d", err);
    char errbuf[400];
    av_strerror(err, errbuf, 400);
    NSLog(@"%s failed with error %s", "avformat_open_input", errbuf);
}
else {
    NSLog(@"Opened stream");
}

err = avformat_find_stream_info(avctx, NULL);
if (err < 0)
{
    char errbuf[400];
    av_strerror(err, errbuf, 400);
    NSLog(@"%s failed with error %s", "avformat_find_stream_info", errbuf);
    return;
}
else {
    NSLog(@"found stream info");
}
I've been using CoreMIDI to connect to USB devices and/or Wi-Fi hosts. It works fine and sends my MIDI events.
I want to send them to the device itself to be played, like MusicPlayer does, but I don't want to send MIDI files, just my own MIDI events.
What should I do? I tried connecting to the first destination available (MIDIGetNumberOfDestinations) but it didn't work.
A corrected answer, now that I understand the question better.
A sample from one of my projects:
// Set up the MIDI client and input port
MIDIClientRef client = 0;
MIDIPortRef inport = 0;
CheckError(MIDIClientCreate(CFSTR("MyApplication"),
                            NULL,
                            NULL,
                            &client),
           "Couldn't create MIDI client");
CheckError(MIDIInputPortCreate(client,
                               CFSTR("MyApplication Input port"),
                               MyMIDIReadProc,          // your MIDIReadProc callback (name is illustrative)
                               (__bridge void *)self,   // refCon handed back to the callback
                               &inport),
           "Couldn't create input port");
[self setInputPort:inport];
[self setMidiClient:client];
[self setDestinationEndpoint:[[self midiSession] destinationEndpoint]];
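To actually push your own events to that destination endpoint, a rough sketch (my addition, assuming the destinationEndpoint property set above and the same CheckError helper) would be:
// Sketch: send a note-on to the stored destination endpoint via an output port.
MIDIPortRef outport = 0;
CheckError(MIDIOutputPortCreate(client,
                                CFSTR("MyApplication Output port"),
                                &outport),
           "Couldn't create output port");

Byte noteOn[] = { 0x90, 60, 100 };                 // note-on, middle C, velocity 100
Byte buffer[1024];
MIDIPacketList *packetList = (MIDIPacketList *)buffer;
MIDIPacket *packet = MIDIPacketListInit(packetList);
packet = MIDIPacketListAdd(packetList, sizeof(buffer), packet,
                           0 /* send now */, sizeof(noteOn), noteOn);

CheckError(MIDISend(outport, [self destinationEndpoint], packetList),
           "Couldn't send MIDI packet list");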