Closed. This question does not meet Stack Overflow guidelines. It is not currently accepting answers.
We don’t allow questions seeking recommendations for books, tools, software libraries, and more.
Closed 7 years ago.
I would like to receive Server-Sent Events in my native iOS app; however, I am not using WebKit/Safari. From what I've found, NSURLConnection is a poor fit, as it chunks the response. I've also looked at the ZTWebSocket library (obviously nice, but I'm seeking SSE, not WebSockets). Would CocoaAsyncSocket be appropriate, or is it limited to pure TCP socket communication?
I have a sneaking suspicion that I am missing something obvious, or there'd already be a library or sample for this. Thanks in advance.
SSE is an HTTP technology in the sense that it is bound to an open HTTP connection. CocoaAsyncSocket provides raw TCP/UDP sockets and knows nothing about HTTP. So no, CocoaAsyncSocket won't give you SSE, as you suspected.
I don't know of any standalone implementation of SSE (in the spirit of the standalone WebSocket implementations), which is maybe what you are searching for. I'm not sure that would make sense anyway, since SSE delivers messages in the form of DOM events, which are most useful in the context of HTML, as far as I can see.
If all you want is to send messages to your iOS app and you are free in your choice of technology, raw sockets would do. But WebSockets are more likely to suit your needs, depending on what you want. Take a look at SocketRocket.
After some more research, it's my opinion that the best way to implement Server-Sent Events on iOS without WebKit is a customized NSURLConnection/NSURLRequest toolset. I settled on ASIHTTPRequest. This library allows you to explicitly control the persistence attribute on the connection object (essential), handle data as it is received over the stream, store responses (e.g. in local files), and so on.
Not to mention it contains lots of other handy networking extensions and customizations: an improved Reachability observer, a simplified async API, a queuing feature, and even the ability to load entire web pages (CSS, JS and all).
On the server side, I'm using cyclone-sse (Tornado) and nginx (as a reverse proxy). Pretty exciting: I can now see my SSEs pushed simultaneously to both my iOS simulator and a browser subscriber. Cyclone even handles all the connections and gives me an API that supports simple POST for message pushes (it also supports AMQP and Redis)...
Long story short, ASIHTTPRequest was a perfect solution for me.
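Whichever HTTP stack ends up delivering the bytes, the stream itself is just the `text/event-stream` wire format: `field: value` lines, with a blank line terminating each event. As a rough, language-neutral illustration (sketched in Python rather than Objective-C), incremental parsing of arriving chunks looks something like this:

```python
# Minimal parser for the SSE (text/event-stream) wire format.
# Feed it decoded chunks as they arrive over the persistent HTTP
# connection; it yields (event, data) pairs at each blank line.

def parse_sse(chunks):
    event, data, buffer = "message", [], ""
    for chunk in chunks:
        buffer += chunk
        while "\n" in buffer:
            line, buffer = buffer.split("\n", 1)
            line = line.rstrip("\r")
            if line == "":                      # blank line ends an event
                if data:
                    yield event, "\n".join(data)
                event, data = "message", []
            elif line.startswith(":"):          # comment / keep-alive
                continue
            elif ":" in line:
                field, _, value = line.partition(":")
                value = value.lstrip(" ")
                if field == "event":
                    event = value
                elif field == "data":
                    data.append(value)          # multiple data: lines join with \n

# Example: two chunks split mid-event, as a streaming callback would deliver them.
events = list(parse_sse(["event: ping\ndata: hel", "lo\ndata: world\n\n"]))
# events == [("ping", "hello\nworld")]
```

The point of the buffering is exactly the NSURLConnection problem above: chunk boundaries do not align with event boundaries, so the parser must carry partial lines across callbacks.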
Try this simple library which is written in Swift:
https://github.com/hamin/eventsource.swift
The API is super simple. It uses NSURLConnection for now.
Closed. This question needs to be more focused. It is not currently accepting answers.
Closed 6 years ago.
I'm trying to create an app like Uber, and I'm having trouble with the iPhone-to-iPhone connection. How am I to send a request to another iPhone, saying "I am your driver!"? Am I to have riders become accepted and add them to some database of riders that drivers can see? Basically, I just want a little explanation of the ways I can use Swift to connect iPhones; any help is appreciated.
Looks like you've got a decently long way to go, but let's break this down.
Despite how it may seem, phones don't usually talk directly to each other. In these circumstances, an app will contact a central server in order to get information about things around them. The phone (in your situation) would likely contact the server and request a list of nearby drivers and locations. The server would then send a feed of nearby drivers and their locations so that the phone could display the locations of the drivers.
When you request a ride, your phone will tell the server its current location, and potentially the target location. A lot of work is done behind the scenes on the server to schedule a driver to pick you up. The server keeps track of where a given driver is, how many other clients they have queued, how long the driver would take to get to you, among many other factors. Once it figures out which driver would best be able to serve you, it will contact that driver and tell them to start moving toward you.
Then the server will contact you saying that it has found a driver, and then will send you the feed as to where that driver is in his progress to get you.
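The driver-selection step described above can be sketched very roughly. The field names and scoring formula here are illustrative assumptions, not how any real dispatch system works:

```python
# Toy sketch of the server-side matching step: pick the driver with the
# best combination of proximity and availability. Purely illustrative.
import math

def pick_driver(rider_pos, drivers):
    """drivers: list of dicts with 'id', 'pos' (x, y), 'queue' (jobs ahead)."""
    def score(d):
        dist = math.dist(rider_pos, d["pos"])   # straight-line distance
        return dist + 5.0 * d["queue"]          # penalize busy drivers
    return min(drivers, key=score)["id"]

drivers = [
    {"id": "a", "pos": (0.0, 1.0), "queue": 2},   # very close, but busy
    {"id": "b", "pos": (3.0, 4.0), "queue": 0},   # farther, but idle
]
best = pick_driver((0.0, 0.0), drivers)
# driver "a" scores 1.0 + 10.0 = 11.0; driver "b" scores 5.0, so "b" wins
```

A production system would use road travel time rather than straight-line distance, but the shape of the problem (score candidates, pick the minimum) is the same.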
So, to answer your question more directly: you'll need to start by setting up a server to do a lot of the work behind the scenes. You can write a server backend in Swift using Vapor, but server-side Swift is in its infancy. I'd also recommend looking into Ruby on Rails (using the Ruby programming language) or Node.js (using JavaScript). None of these are trivial undertakings.
Given the nature of your question, the problem you're attempting to solve is certainly a lot more difficult than you've anticipated. But don't let that stop you from asking questions like these.
Closed 7 years ago.
I'm in need of an open-source solution/library to stream RTSP/RTMP to an iOS application. I need to build an app that connects to a media server and opens the provided video stream. I believe there have to be libraries out there, but I have yet to find one that is open source, compiles, actually works, and runs on iOS 5+ and iPhone 4+. I have no preference; RTMP or RTSP will suffice, preferably whichever requires the least work. I have RTSP working on the Android side, but nothing for iOS yet.
This is what I already know from today's research:
RTSP
Seems possible using Live555/FFMPEG
MooncatVenture Group - Old FFMPEG, not compatible with ARMv7s (No updates/blogs/commits in over a year)
DFURTSPPlayer - This is a working example.
RTMP
Seems possible using Live555/FFMPEG
A few libraries are out there for data messaging, but that is all
MidnightCoders Project - Video support does not seem to be built yet, as audio support is not either.
I've never messed with anything video-related before, so encoding, frame rates, key frames, chunks, etc. are pretty foreign to me. Right now, it seems building a static binary from Live555/FFMPEG is the only solution to my problem. If so, can anyone give me a simple quick-start guide or links to a blog/example someone has out there? I'm not looking for anything crazy, just a simple:
Download This - LINK
Compile it like this - LINK
Place it into X Folder in Xcode
Create X Object
Read Stream API here - LINK
If not, anyone want to point me to a working open source library?
Oh yeah, this happens to be my first iPhone app and first time in Objective-C. Awesome first project, yeah?
DFURTSPPlayer is a working example on GitHub. I will have to double-check the licensing issues, but it is a good place to start for RTSP.
It seems that, at this time, the only way to do what I want is to create a static binary to use, from complete scratch. libavcodec, FFMPEG, and Live555 are all under the LGPL, which means that in order not to make my code open source, I would have to allow dynamic linking, so that my app's users have the ability to swap in modified versions of the open-source libraries I used whenever they want. The App Store does not allow dynamic linking, so I am essentially dead in the water unless I want to write it all from scratch. Which I definitely do not want to do...
Closed 7 years ago.
Is there a logging framework for iOS that could aid developers in diagnosing app crashes?
You may like:
Lumberjack: stable and traditional
"It is similar in concept to other popular logging frameworks such as log4j, yet is designed specifically for Objective-C, and takes advantage of features such as multi-threading, grand central dispatch (if available), lockless atomic operations, and the dynamic nature of the Objective-C runtime."
LibComponentLogging: beautiful and hardcore, used by RestKit
"LibComponentLogging is a small logging library for Objective-C applications on Mac OS X and the iPhone OS which provides conditional logging based on log levels and log components. Additionally, different logging strategies can be used, e.g. writing log messages to a file or sending them to the system log, while using the same logging interface."
NSLogger: fancy with a dedicated visualization OS X App
"NSLogger is a high-performance logging utility which displays traces emitted by client applications running on Mac OS X or iOS (iPhone OS). It replaces your usual NSLog()-based traces and provides powerful additions like display filtering, image and binary logging, traces buffering, timing information, etc."
I know this post is old but I'm looking for one as well. I found one called Lumberjack, though I haven't tried it yet.
I created a simple logging framework that might help. I'd appreciate any feedback you have. Hope it helps.
Link to Project Page
This previous question seems to overlap. But the bottom line is:
NSLog(@"message");
or:
printf("message");
I have a slightly different need: not only do I want to debug crashes, but I also need to debug other errors (NSError, NSException).
I tried all three packages mentioned in IlDan's answer. However, all of them require me to adopt a new way of logging, which may not be compatible with the libraries I depend on. For example, I intended to adopt NSLogger, but RestKit, an important library in my project, uses LibComponentLogging.
So I ended up writing a small pod (https://github.com/kennethjiang/Teleport-NSLog) for that. The mechanism is to redirect stderr (where NSLog and all these logging frameworks write their messages) to a backend HTTP server. Now I can debug my app running on a user's device just as if it were running in my Xcode. :)
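The underlying stderr-redirection trick is not iOS-specific. Here is a rough sketch of the same dup2()/pipe mechanism in Python, with the HTTP forwarding reduced to a comment:

```python
# Sketch of the stderr-capture mechanism: dup2() points fd 2 at a pipe,
# a reader thread drains it line by line, and each line could then be
# forwarded to an HTTP backend instead of being appended to a list.
import os, threading

read_fd, write_fd = os.pipe()
saved_stderr = os.dup(2)           # keep the real stderr around
os.dup2(write_fd, 2)               # anything written to fd 2 now hits the pipe

captured = []

def drain():
    with os.fdopen(read_fd) as pipe:
        for line in pipe:
            captured.append(line.rstrip("\n"))  # ship to a server here

t = threading.Thread(target=drain, daemon=True)
t.start()

os.write(2, b"something went wrong\n")  # stands in for NSLog output

os.close(2)                        # close both write ends so the reader
os.close(write_fd)                 # sees EOF and finishes
t.join(timeout=2)
os.dup2(saved_stderr, 2)           # restore the original stderr
```

On iOS the same idea is expressed with the C calls (`pipe`, `dup2`) against `STDERR_FILENO`; the point is that nothing in the app's logging code has to change.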
For basic logging, use NSLog(@"your message here").
If you want more flexible logging, look into Lumberjack. It can let you disable logging in production, etc.
Closed. This question needs details or clarity. It is not currently accepting answers.
Closed 5 years ago.
What are the HCI challenges of Web 2.0?
Here are a few more:
Clear privacy options
Facebook has repeatedly changed the way it deals with content ownership and privacy. (See here, here and here.) Aside from the obvious PR gaffes, this has also demonstrated the difficulty users have understanding privacy.
Geeks like us are familiar with the ideas of inheritance and groups. Heck, many of us work explicitly with permission structures when dealing with files on *nix systems. To most users, though, it's not clear who can see what, or why.
Service Interoperability
On the desktop, we're used to being able to chain tools together to get the outcome we want. A simple example would be dragging image thumbnails from a file explorer to an image editor. We'd expect that to work, but not on the web.
The Flock browser goes some way to overcome this shortfall, as does the Google Docs web clipboard, but interaction between web services is still a long way off what we expect from the desktop.
Accessibility
Web 1.0 was primarily text based, so the main accessibility issues were easy to fix: stuff like text as images and tables for layout, which both affect screen-readers used by the blind.
As the content of the web gets richer (more images, video and audio), the chances get larger that someone will be excluded from it. Moreover, making video and audio accessible is much harder than making text or images accessible, so it's much less likely to be done.
Lastly, Web 2.0 introduced a whole new problem for accessibility: dynamic content. How should screen-readers (for example) deal with new content appearing on a page after an AJAX query? WAI-ARIA aims to address these issues, but they still require the web-designer to implement them.
Hope this was useful.
There are plenty as I see it:
Different screen resolutions.
Different hardware capabilities (mobile, touch, desktop, laptop; soon orientation too).
Localized content.
Location-based services.
With HTML5 upcoming: hardware acceleration, native APIs, local storage, offline support.
Closed. This question is opinion-based. It is not currently accepting answers.
Closed 1 year ago.
I'm looking at designing an application to run on POS terminals alongside the software already installed. I'd like it to receive POS printer commands and intercept and modify some of them. For example, when a receipt is printed, we'd like to add a custom reference number in the middle of it, without having to modify the third-party POS applications.
I'd love to hear people's suggestions on the best way to approach this; reading through the POS specs, it doesn't seem trivial.
I think the solution will be to do this outside of the POS app, but communicate on a level that it understands: print. Something that looks like a printer to capture the data, reformat it, and then send it on would do the trick, or something that interfaces with the OS (let's say Windows) at the printer-port level.
In Windows we use a custom port monitor we created to capture and route this data; it's something we use internally, so I wouldn't suggest it for you, as it has some bugs. A similar solution is RedMon. It could provide the solution, or give you ideas on how to accomplish it. Once the data is captured, you launch a process against it.
Alternatively, if it's over the network, you can always set up something that monitors port 9100 (RAW) or 515 (LPR) to intercept the data.
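As a rough sketch of that network approach (in Python; the injection rule, the "TOTAL" marker, and the printer address are all illustrative assumptions, not part of any POS spec):

```python
# Sketch of a RAW (port 9100) interceptor: listen where the POS expects
# the printer, rewrite the job, then forward it to the real device.
import socket
import socketserver

def inject_reference(job: bytes, ref: str, after: bytes = b"TOTAL") -> bytes:
    """Insert a reference-number line after the first line containing `after`."""
    lines = job.split(b"\n")
    for i, line in enumerate(lines):
        if after in line:
            lines.insert(i + 1, b"REF: " + ref.encode())
            break
    return b"\n".join(lines)

class PrintProxy(socketserver.StreamRequestHandler):
    PRINTER = ("192.168.0.50", 9100)   # hypothetical real printer address

    def handle(self):
        job = self.rfile.read()                    # whole job from the POS
        job = inject_reference(job, "R-12345")
        with socket.create_connection(self.PRINTER) as printer:
            printer.sendall(job)                   # forward to the device

# The rewrite itself, without any sockets involved:
out = inject_reference(b"ITEM 1\nTOTAL 9.99\nTHANKS\n", "R-12345")
# out == b"ITEM 1\nTOTAL 9.99\nREF: R-12345\nTHANKS\n"
```

Running `socketserver.TCPServer(("", 9100), PrintProxy).serve_forever()` and pointing the POS printer port at this machine would complete the picture. Real receipt jobs are ESC/POS command streams rather than plain text, so the matching logic would need to understand that framing.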
Lastly, if it's Windows and you don't want to create something as low-level as RedMon, you can always use named pipes. You'd run a service application that monitors a named pipe. The printer the POS prints to would have its port set to 'local', with a port name of the form \\.\pipe\. This would allow the application to communicate directly with your service and thus launch a process.
You could have multiple named-pipe/RedMon/network ports set up, each with a unique associated output, to direct the data to the correct device on the other side.