I'm wondering if the following has ever been done before, ideally in Unity.
What I want is to be able to take an image I have on my iPad and send it to a screen to be displayed with a flick gesture. Much like what you do with a window on a computer with dual monitors.
You drag it, and it instantly appears on the other monitor.
If this hasn't been done before, how would you go about making it possible? I know that it is going to require a fair amount of networking if I'm to pass an image from one device to another.
This depends strongly on what you want to include under the designation of "monitor" in this particular context.
If it's a screen connected to a computer, you can theoretically do all this by creating an iPad client application and a computer server application, and using some way to communicate between the two (direct Wi-Fi, over the internet, etc.). Of course, you'd need to make specific server applications for the operating systems you'd like to target (Windows, Linux, etc.).
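As a rough illustration of the client half of that idea, here is a minimal Swift sketch that pushes one image to a waiting server over TCP. The host, the port 5555, and the 4-byte length-prefix framing are all assumptions made up for this sketch, not an existing protocol:

```swift
import Network
import UIKit

/// Sends one image to a listening server application over raw TCP.
/// The host, port 5555, and 4-byte length prefix are assumptions for this sketch.
func send(image: UIImage, toHost host: String, port: UInt16 = 5555) {
    guard let jpeg = image.jpegData(compressionQuality: 0.8) else { return }

    let connection = NWConnection(host: NWEndpoint.Host(host),
                                  port: NWEndpoint.Port(rawValue: port)!,
                                  using: .tcp)
    connection.stateUpdateHandler = { state in
        guard case .ready = state else { return }
        // Length-prefix the payload so the server knows how many
        // bytes belong to this image before decoding it.
        var length = UInt32(jpeg.count).bigEndian
        var packet = Data(bytes: &length, count: 4)
        packet.append(jpeg)
        connection.send(content: packet, completion: .contentProcessed { _ in
            connection.cancel()   // one image per connection in this sketch
        })
    }
    connection.start(queue: .global())
}
```

The server application would then read the 4-byte length, read that many bytes, decode the JPEG, and display it full-screen; the flick gesture would simply trigger a call like this once it is recognised.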
If the screen is a TV, you should check which ones are "smart" enough to allow an external wireless connection (or a wired one, if that's what you prefer) that an iPad can use. Then, for each of those TVs, you'd have to check what APIs they provide, and whether those APIs let you access the connectors in the way you want. Then it's the same story as before: choose which platforms you want to target and create the software for those.
One way you can explore this is with your tablet and computer (any Android tablet plus any computer with any OS, or an iPad with a Mac). Try doing some simple coding, just to see how you can transfer an image from tablet to computer. Once that's done, the presentation part of your iPad application (gallery of images, finger-flick support, etc.) can be done with Unity.
I wrote an iPad app to control a proprietary "robot" (based on a 3D printer). With respect to providing a GUI for the robot, the iPad (mini) is ideal in a way, since it avoids additional I/O devices such as a mouse or a keyboard and obviously comes with a display.
The app currently interfaces with the robot via WiFi (utilising WebSockets). While this basically works, every once in a while latency seems to be an issue. At the same time, the WiFi interface to the robot seems to be a bit of overkill, since the iPad is physically mounted to the frame of the robot, with a physical distance to its motherboard of less than 3 feet. Also, the robot can be beautifully controlled from a serial terminal over UART (e.g. from macOS). Therefore, I would like to replace the WiFi/WebSocket communication with a simple serial interface.
Unfortunately, though, "/dev/tty.iap" does not seem to be accessible on iOS/iPadOS unless you enroll in Apple's MFi program, which, in this case, also appears to be overkill, since our app will only ever be used internally. redpark.com makes (expensive, MFi-certified) serial cables and a library, which I consider a last resort ...
Jailbreaking the iPad might also be a way to go. However, I have no experience with this and would really like to avoid bricking the iPad. On the other hand, I have no fundamental objections against jailbreaking my device, which is exclusively used to control the robot.
Can anyone recommend a safe (e.g. jailbreak) route towards accessing "/dev/tty.iap" for in-house apps? Would "/dev/tty.iap" be immediately accessible on a jailbroken iPad? Would the iPad have to be jailbroken again every time it is rebooted? What would be a suitable jailbreak tool for my purposes (candidates are an iPad mini 5 with iPadOS 15.1 or an iPad mini 2 with iOS 12.5.5)? Are there any alternative routes towards accessing "/dev/tty.iap" for in-house apps (avoiding jailbreaking or MFi)?
Thanks!
EDIT 1, one more thought: there are numerous unpublished APIs providing access to normally "hidden" iOS features. While employing such APIs would disqualify any app from the App Store, in "my" case, publishing the app is not even being considered. Hence, if somebody is aware of unpublished functions that provide the desired access to "/dev/tty.iap", such hints would also be appreciated very much.
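In case it helps frame answers: once the device node is accessible at all (after a jailbreak, or through an MFi accessory), talking to it should be plain POSIX termios code. The sketch below assumes the sandbox no longer blocks the open(2) call; the 115200 baud rate and the G-code command are placeholders:

```swift
import Darwin

// Open the serial device; O_NOCTTY prevents it from becoming
// the controlling terminal of the process.
// ASSUMPTION: the sandbox/jailbreak situation actually permits this open().
let fd = open("/dev/tty.iap", O_RDWR | O_NOCTTY)
guard fd >= 0 else { fatalError("open failed: \(String(cString: strerror(errno)))") }

// Configure raw 8N1 at an assumed 115200 baud.
var tty = termios()
tcgetattr(fd, &tty)
cfmakeraw(&tty)
cfsetspeed(&tty, speed_t(B115200))
tcsetattr(fd, TCSANOW, &tty)

// Send a command and read the reply.
let cmd = "G28\n"   // placeholder G-code, since the robot is 3D-printer based
_ = cmd.withCString { write(fd, $0, strlen($0)) }

var buf = [UInt8](repeating: 0, count: 256)
let n = read(fd, &buf, buf.count)
if n > 0 { print(String(decoding: buf[0..<n], as: UTF8.self)) }
close(fd)
```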
We were exploring various test suites for mobile automated testing and ran into a company called Perfecto Mobile. One of the features that blew me away was that they are able to (without jailbreaking) effectively perform a "remote desktop" on a physical iPad.
So, the iPad's screen is mirrored within a web application, which can register touch/swipe events in the web app and perform them on the device. The only relevant technical detail I have is that all of this is performed using commands sent over the USB cable.
I'm really curious as to how this is implemented and details on relevant Private APIs if any.
Thanks,
Teja
I'm not familiar with PerfectoMobile, but I can give you a few pointers on how this can be accomplished:
For the mirroring, one way would be to look at using AirPlay. The APIs are pretty well documented, but not for what we're talking about, which would require some serious reverse engineering; it's definitely possible, though, these guys have done it. A different approach would be to run a background app that periodically takes snapshots of the main screen and sends them over a socket connection to a client. You could do this as a VNC server, and to incorporate the remote view in a web app, you could use noVNC. As far as using a USB connection goes, in the case of the background app talking to a client over TCP, you could do a port forward.
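As a toy version of that snapshot-over-a-socket approach, here is a Swift sketch that periodically renders the app's own key window and streams JPEG frames over TCP. Note that the public drawHierarchy API can only capture your own app's windows; capturing the whole screen the way Perfecto Mobile does would need the reverse-engineered or private routes mentioned above. The port 5901 and the length-prefix framing are assumptions (a real VNC server would speak the RFB protocol instead):

```swift
import UIKit
import Network

/// Periodically captures this app's key window and streams JPEG frames
/// over a TCP connection. Port 5901 and the framing are assumptions.
final class SnapshotStreamer {
    private let connection: NWConnection
    private var timer: Timer?

    init(host: String, port: UInt16 = 5901) {
        connection = NWConnection(host: NWEndpoint.Host(host),
                                  port: NWEndpoint.Port(rawValue: port)!,
                                  using: .tcp)
        connection.start(queue: .main)
    }

    func start(interval: TimeInterval = 0.5) {
        timer = Timer.scheduledTimer(withTimeInterval: interval, repeats: true) { [weak self] _ in
            guard let self = self,
                  let window = UIApplication.shared.windows.first else { return }
            // Render the window into an image (own-app content only).
            let renderer = UIGraphicsImageRenderer(bounds: window.bounds)
            let image = renderer.image { _ in
                window.drawHierarchy(in: window.bounds, afterScreenUpdates: false)
            }
            guard let jpeg = image.jpegData(compressionQuality: 0.5) else { return }
            // Length-prefix each frame so the viewer can split the stream.
            var length = UInt32(jpeg.count).bigEndian
            var frame = Data(bytes: &length, count: 4)
            frame.append(jpeg)
            self.connection.send(content: frame, completion: .contentProcessed { _ in })
        }
    }
}
```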
To actually perform on the device the touch events sent from your remote viewer, most people have been using the GSEvent group of functions from the GraphicsServices private framework, without needing to jailbreak the device. Again, a background app would receive over a socket an instruction such as "tap there", instantiate the GSEvent, and inject it so it gets processed in the run loop of the frontmost app.
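The receiving end of that instruction channel might look roughly like the sketch below. The newline-delimited "TAP x y" protocol is invented for illustration, and the actual event construction is left as a comment, since GraphicsServices is private and its structures are undocumented:

```swift
import Foundation
import Network

// Listens for newline-delimited commands such as "TAP 100 200"
// and hands the coordinates to an injection routine.
// Port 4545 and the command format are assumptions for this sketch.
let listener = try! NWListener(using: .tcp, on: 4545)
listener.newConnectionHandler = { conn in
    conn.start(queue: .main)
    func receiveLoop() {
        conn.receive(minimumIncompleteLength: 1, maximumLength: 1024) { data, _, done, _ in
            if let data = data, let text = String(data: data, encoding: .utf8) {
                let parts = text.trimmingCharacters(in: .whitespacesAndNewlines)
                                .split(separator: " ")
                if parts.count == 3, parts[0] == "TAP",
                   let x = Double(parts[1]), let y = Double(parts[2]) {
                    // Build and inject a GSEvent at (x, y) here.
                    // (GraphicsServices is private; intentionally not shown.)
                    print("would tap at (\(x), \(y))")
                }
            }
            if !done { receiveLoop() }
        }
    }
    receiveLoop()
}
listener.start(queue: .main)
dispatchMain()   // keep the background process alive
```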
These few possibilities, at least, have been implemented successfully in different iOS apps up to iOS 6.1 (iOS 7 is a different animal). You won't find any such app in the App Store, since Apple clearly prohibits the use of private frameworks in third-party apps; instead, people deploy them in-house using Enterprise and ad-hoc provisioning profiles. On Android, however, there's VMLite available in the Play Store.
If you're looking to share the screen from iOS or Android, check out skreen.me. They have sample apps you can try out, and they also provide libs for mobile app integration.
I'm currently developing a mobile website with jQuery Mobile, not exactly responsive web design. I know I can develop the project in the browser on my desktop PC with some plugins, or use some of the online services or simulators available. But I'm not sure if I'm missing something really important to test.
Example:
touch/swipe events or viewport rotation.
Is it necessary to purchase physical devices (smartphones/tablets) to develop and test the project? Why?
Intro
First, don't let anyone tell you it is not necessary to purchase real devices for testing purposes. I will tell you why from an Android perspective; the same thing, just to a much smaller degree, also goes for iOS development.
Good sides of emulator testing
It is free; you only need a computer to run the emulator.
You can test your applications in different configurations (different screen resolutions, different OS versions).
Faster I/O and network operations, but this is not much of an advantage when you consider how slow everything else is.
Bad sides
It is slooooooooow. If you have never tried to use it, you can't comprehend how slow it is (the iOS simulator is blazingly fast in comparison). It doesn't matter if you have a top-of-the-line PC or Mac; it is just that slow.
The emulator is simply too darn buggy; there will be a lot of times when an application works just fine on a real device and breaks on the emulator.
This also goes the other way around: sometimes an application will work just fine on the emulator but break on a real device, and in some cases it will not work at all, or not on some devices. This is usually the case when working with hybrid applications; for some reason the Android web view acts differently on real devices than on the emulator.
The emulator simply doesn't have some functionality needed to interact with hardware, nor can it successfully emulate it. Even the hardware connections it can emulate sometimes don't work correctly.
I have talked about how slow it is (because it converts ARM bytecode to x86 on the fly), but from a graphics standpoint it tends to be even slower, so don't expect to do any game development on it.
Real devices come with much more preinstalled software, which may slow your application down or in some ways enhance its functionality.
Real-world GPS testing is out of the question.
Final notes
If you intend to work with iOS only, the simulator can be used for much of the development. The sheer lack of different screen sizes and hardware diversity makes it a perfect platform for testing purposes. Android, on the other hand, is a completely different story: its emulator is simply useless for testing purposes. I have several real Android devices, ranging from Android 2.1 upwards, with different screen sizes and, finally, different hardware architectures. You don't need to believe me, but everything I mentioned plays a significant role when testing Android applications.
If your main concern is testing your jQuery Mobile application, I would still advise the use of a real device in the case of Android, while in the case of iOS you can successfully use the simulator. Android is problematic because transition effects are too darn slow, and that includes everything else that requires animations. Swipe will not be a problem; I can vouch that it works just fine. The second real problem is device rotation: jQuery Mobile can sometimes have a problem with it, mostly when used with non-responsive third-party jQuery plugins (carousels, sliders, ...). The third problem is mentioned in my list of bad sides: the web view used in the emulator acts differently from the one in a real phone, so sometimes you will see one thing on your real device and another in the emulator.
It is not necessary to purchase such a device.
For Android, there is an emulator provided with the Android SDK. You can use it to configure multiple emulated devices with different screen sizes, etc., to test for multiple resolutions and Android (browser) versions.
[edit] Though to really test it for iPhones you would need an emulator for those too, I suppose, to make sure the website is displayed correctly in the provided browser.
[edit 2] To test "real" smartphone apps (not webapps), it is better to have a real device at hand.
This very much depends on the level to which you want to test it before you are happy to hand it over for use. Once you do that and someone reports a defect, will you be able to see where the problem is (if it works on your PC)?
The development itself can be done in your browser, and you can even simulate swipe events by dragging the mouse. You don't even need any simulators; you can just make the Chrome window smaller (most devices use some version of WebKit, same as Chrome).
However, once it comes to testing, I wouldn't feel great if I didn't know how it looks on the device itself. So I think having at least one device (ideally two with different OS and resolution) is always beneficial.
I would also be unsatisfied if I was working on something of which I could not see the result :)
I have an idea for an app that I'd like to develop, but before I invest a lot of time learning objective C and the iOS APIs, I'd like to make sure that what I want to do is feasible.
The app I want to make is a purely auditory (sound-only) version of Google Glass. I'm visually impaired, so spending a lot of money on something visual, even though it can read content to you, would not be worth it. But if I could use an iPhone to give many of the same options as Google Glass, that would be great.
Many times, I've wanted a piece of information while walking down the street, where I couldn't easily get to my iPhone, because I have my cane in one hand, and something else in the other. In such cases, it'd be awesome if I could say a command, and get a voice response.
I'd use the microphone built into the Apple earphones for audio input, but I'm not sure if it's possible to listen for audio input while the screen is locked. I'm certain it's not possible with a non-jailbroken iPhone.
Can anyone tell me if this is possible?
Yes, you can do this.
In order to keep your app running all the time, even when the iPhone is locked, you could build a Launch Daemon. A launch daemon can start when the phone does, and is not subject to the restrictions that iOS puts on sandboxed apps installed to /var/mobile/Applications/.
You do need to have a jailbroken device to take advantage of Launch Daemons. Here is a good tutorial on building one.
Launch Daemons are also a normal part of OS X, so if you need more information, you might try consulting the OS X docs online. Most aspects of Launch Daemons work the same way on a jailbroken iPhone.
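For orientation, the skeleton of a daemon's launchd property list, installed under /Library/LaunchDaemons/ on a jailbroken device, looks like this (the label and program path below are placeholders):

```xml
<?xml version="1.0" encoding="UTF-8"?>
<!DOCTYPE plist PUBLIC "-//Apple//DTD PLIST 1.0//EN"
  "http://www.apple.com/DTDs/PropertyList-1.0.dtd">
<plist version="1.0">
<dict>
    <!-- Unique reverse-DNS identifier for the daemon (placeholder) -->
    <key>Label</key>
    <string>com.example.audiolistener</string>
    <!-- Executable to run; lives outside the app sandbox (placeholder path) -->
    <key>ProgramArguments</key>
    <array>
        <string>/usr/libexec/audiolistener</string>
    </array>
    <!-- Start at boot and restart the daemon if it exits -->
    <key>RunAtLoad</key>
    <true/>
    <key>KeepAlive</key>
    <true/>
</dict>
</plist>
```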
You'll also want to be able to detect certain events, to activate your app. You certainly don't want to be processing an audio stream at all times (maybe you only activate the app when you start walking with your cane). To detect events, like a home button press (or however you want to activate your code), I would take a look at RPetrich's libactivator library.
We have recently developed an iPad application and now need to start demonstrating it to customers and prospects as part of our overall product suite during webinars. As part of our Agile methodology, we also need to periodically review the application with key customers without having to distribute it since the application is not a standalone application and requires a connection to web services installed at each customer site.
We have searched high and low for any solution that doesn't involve rooting the device but have been unable to find one. The most common suggestion seems to be to point a webcam at the device, but that comes across as very unprofessional.
I know that there are VGA out adapters that can be plugged into the iPad and we have used these to present through a projector when the customer is physically present, but this is a relatively rare occurrence. Perhaps there are solutions that we are unaware of that can be used to send VGA output back into a desktop device for screen sharing?
Put a Slingbox on your LAN and connect the iPad video to the Slingbox video input. Then use a web browser on your computer to view the Slingbox feed and share your screen with WebEx as usual.
EDIT - BTW, there are other gadgets besides the Slingbox for getting composite video into a computer, such as the Elgato Video Capture, to name one.
A better option would be to use http://www.reflectionapp.com/
(I'm not affiliated with the company.)
I use this app.
http://www.airserverapp.com
It can be used with Windows or Mac.
Easy to use -- the only trouble I have is I keep poking my PC instead of the iPad.