We have recently developed an iPad application and now need to start demonstrating it to customers and prospects as part of our overall product suite during webinars. As part of our Agile methodology, we also need to periodically review the application with key customers without having to distribute it since the application is not a standalone application and requires a connection to web services installed at each customer site.
We have searched high and low for any solution that doesn't involve jailbreaking the device, but have been unable to find one. The most common suggestion seems to be to point a webcam at the device, but that comes across as very unprofessional.
I know that there are VGA out adapters that can be plugged into the iPad, and we have used these to present through a projector when the customer is physically present, but that is a relatively rare occurrence. Perhaps there are solutions we are unaware of that can feed the VGA output back into a desktop machine for screen sharing?
Put a Slingbox on your LAN and connect the iPad video to the Slingbox video input. Then use a web browser on your computer to view the Slingbox feed and share your screen with WebEx as usual.
EDIT - BTW, there are other gadgets besides the Slingbox for getting composite video into a computer, such as the Elgato Video Capture, to name one.
A better option would be to use http://www.reflectionapp.com/
(I'm not affiliated with the company.)
I use this app.
http://www.airserverapp.com
It can be used with Windows or Mac.
Easy to use -- the only trouble I have is that I keep poking my PC instead of the iPad.
We were exploring various test suites for mobile automated testing and ran into a company called Perfecto Mobile. One of the features that blew me away was that they are able to (without jailbreaking) effectively provide a "remote desktop" on a physical iPad.
So the iPad's screen is mirrored within a web application, which can register touch / swipe events and perform them on the device. The only relevant technical detail I have is that all of this is performed using commands sent over the USB cable.
I'm really curious as to how this is implemented and details on relevant Private APIs if any.
Thanks,
Teja
I'm not familiar with PerfectoMobile, but I can give you a few pointers on how this can be accomplished:
For the mirroring, one way would be to look at AirPlay. The APIs are pretty well documented, but not for doing what we're talking about, which would require some serious reverse engineering; it's definitely possible, though, and these guys have done it. A different approach would be to run a background app that periodically takes snapshots of the main screen and sends them over a socket connection to a client. You could implement this as a VNC server, and to incorporate the remote view in a web app you could use noVNC. As for using a USB connection, in the case of the background app talking to a client over TCP, you could do a port forward.
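To make the snapshot-and-socket approach concrete, here's a rough sketch (the class name, port, JPEG quality, and the 4-byte length prefix per frame are my own assumptions, not any particular product's protocol). Keep in mind a sandboxed app can only snapshot its own windows this way; capturing other apps' screens is exactly where the private, reverse-engineered pieces come in.

```swift
import UIKit
import Network

/// Sketch of the "periodic snapshot over a TCP socket" idea.
/// Assumed details: class name, port, JPEG quality, and a 4-byte
/// big-endian length prefix so the viewer can find frame boundaries.
/// A sandboxed app can only capture its own windows this way.
final class MirrorSender {
    private let connection: NWConnection
    private var timer: Timer?

    init(host: String, port: UInt16) {
        connection = NWConnection(host: NWEndpoint.Host(host),
                                  port: NWEndpoint.Port(rawValue: port)!,
                                  using: .tcp)
        connection.start(queue: .main)
    }

    func startMirroring(window: UIWindow, fps: Double = 5) {
        timer = Timer.scheduledTimer(withTimeInterval: 1.0 / fps, repeats: true) { [weak self] _ in
            self?.sendSnapshot(of: window)
        }
    }

    private func sendSnapshot(of window: UIWindow) {
        // Render the window into an image (only works for our own windows).
        let renderer = UIGraphicsImageRenderer(bounds: window.bounds)
        let image = renderer.image { _ in
            window.drawHierarchy(in: window.bounds, afterScreenUpdates: false)
        }
        guard let jpeg = image.jpegData(compressionQuality: 0.5) else { return }

        // Length-prefixed frame so the viewer knows where each JPEG ends.
        let length = UInt32(jpeg.count).bigEndian
        var frame = withUnsafeBytes(of: length) { Data($0) }
        frame.append(jpeg)
        connection.send(content: frame, completion: .contentProcessed({ _ in }))
    }

    func stop() {
        timer?.invalidate()
        connection.cancel()
    }
}
```

A viewer on the desktop side would just read the 4-byte length, read that many bytes, and render the JPEG; wrapping the same loop in the RFB protocol is what would turn it into a proper VNC server that noVNC could talk to.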
To actually perform on the device the touch events sent from your remote viewer, most people have been using the GSEvent group of functions from the GraphicsServices private framework, without needing to jailbreak the device. Again, a background app would receive over a socket an instruction such as "tap there", instantiate the GSEvent, and inject it so it gets processed in the run loop of the frontmost app.
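As a rough illustration of that "receive a command, inject it" loop, here's a sketch of the listening side only. The one-line "TAP x y" protocol and the port are invented for illustration, and the actual GSEvent construction is private and version-specific, so it's left as a placeholder comment.

```swift
import Foundation
import Network
import CoreGraphics

/// Sketch of the command-receiving side of a touch-injection agent.
/// Assumed details: the "TAP <x> <y>" text protocol and the port.
/// The private GSEvent construction/injection is only indicated by
/// the placeholder comment below.
final class TouchCommandListener {
    private let listener: NWListener

    init(port: UInt16) throws {
        listener = try NWListener(using: .tcp, on: NWEndpoint.Port(rawValue: port)!)
        listener.newConnectionHandler = { [weak self] connection in
            connection.start(queue: .main)
            self?.receiveCommands(on: connection)
        }
        listener.start(queue: .main)
    }

    private func receiveCommands(on connection: NWConnection) {
        connection.receive(minimumIncompleteLength: 1, maximumLength: 1024) { [weak self] data, _, isComplete, _ in
            if let data = data, let line = String(data: data, encoding: .utf8) {
                self?.handle(command: line.trimmingCharacters(in: .whitespacesAndNewlines))
            }
            if !isComplete { self?.receiveCommands(on: connection) }
        }
    }

    private func handle(command: String) {
        // Expected form: "TAP <x> <y>"
        let parts = command.split(separator: " ")
        guard parts.count == 3, parts[0] == "TAP",
              let x = Double(parts[1]), let y = Double(parts[2]) else { return }
        let point = CGPoint(x: x, y: y)

        // Placeholder: this is where the GSEvent would be built and injected
        // via the private GraphicsServices functions mentioned above.
        print("Would inject tap at \(point)")
    }
}
```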
These few possibilities, at least, have been implemented successfully in different iOS apps up to iOS 6.1 (iOS 7 is a different animal). You won't find any such app in the App Store, since Apple clearly prohibits the use of private frameworks in 3rd-party apps; instead, people deploy them in-house using Enterprise and ad hoc provisioning profiles. On Android, however, there's VMLite available in the Play Store.
If you're looking to share a screen from iOS / Android, check out skreen.me. They have sample apps you can try out, and they also provide libraries for mobile app integration.
I'm wondering if the following has ever been done before, ideally in Unity.
What I want is to be able to take an image I have on my iPad and send it to a screen to be displayed with a flick gesture. Much like what you do with a window on a computer with dual monitors.
You drag it, and it instantly appears on the other monitor.
If this hasn't been done before, how would you go about making it possible? I know that it is going to require a fair deal of networking if I'm to pass an image from one device to another.
This depends strongly on what you want to include under the designation of "monitor" in this particular context.
If it's the screen connected to a computer, you can theoretically do all this by creating an iPad client application and a computer server application, and using some way to communicate between the two (either direct WiFi, or through the internet, etc.). Of course you'd need to make specific server applications for the operating systems you'd like to target (Windows, Linux, etc.).
If the screen is a TV, you should check which are the "smart-enough" ones that allow an external wireless connection (or maybe a wired one, if that's what you prefer) that an iPad can use. Then, for each of those TVs, you'd have to check what APIs they provide, and whether said APIs let you access the connectors the way you want. Then it's the same story as before: choose which platforms you want to target and create the software for those.
One way you can explore this is with your tablet and computer (either any Android tablet plus any computer with any OS, or an iPad with an Apple computer). Try doing some simple coding, just to see how you can transfer an image from the tablet to the computer. Once that's done, the presentation part of your iPad application (gallery of images, finger-flick support, etc.) can be done with Unity.
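For the computer-screen case, here's a minimal sketch of what the iPad-client half could look like in native code (the server URL, the upload endpoint, and the view layout are invented for illustration; the desktop server application would simply display whatever it receives). The same flow could also be rebuilt inside Unity once you've proven the transfer works.

```swift
import UIKit

/// Sketch of the "flick an image to the other screen" idea, iPad side only.
/// Assumed details: the server URL/endpoint and the single-image-view layout.
/// A hypothetical desktop server at that address displays the uploaded image.
final class FlickSendViewController: UIViewController {
    let imageView = UIImageView()
    // Hypothetical desktop server that accepts raw PNG uploads.
    let serverURL = URL(string: "http://192.168.1.20:8080/show")!

    override func viewDidLoad() {
        super.viewDidLoad()
        imageView.frame = view.bounds
        imageView.isUserInteractionEnabled = true
        view.addSubview(imageView)

        // A swipe recognizer stands in for the "flick towards the monitor" gesture.
        let swipe = UISwipeGestureRecognizer(target: self, action: #selector(didFlick))
        swipe.direction = .up
        imageView.addGestureRecognizer(swipe)
    }

    @objc private func didFlick() {
        guard let png = imageView.image?.pngData() else { return }
        var request = URLRequest(url: serverURL)
        request.httpMethod = "POST"
        // Push the image bytes to the server; it shows whatever it receives.
        URLSession.shared.uploadTask(with: request, from: png) { _, _, error in
            if let error = error { print("Upload failed: \(error)") }
        }.resume()
    }
}
```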
We are in the process of creating a web app that has print capabilities, and would like to be able to support this functionality on as many devices as possible. Our users are specifically using iPads, but we will eventually want to support other devices. I've seen that iOS now uses AirPrint, but what about printers that don't support this? Is there any way to cover that from a web app standpoint or are other measures necessary?
Check out this App: https://itunes.apple.com/de/app/printer-pro/id393313223?mt=8
If it's an option for you, you could use it or come up with something of your own, but it is possible.
I'm working on my senior engineering design project and I need your help! For this, my iPhone app receives images from an external camera circuit, which I built.
To interface my iPhone app to the camera circuit, I have looked into the following approaches:
Build a bluetooth module on the camera circuit, to transfer images to the iPhone
Use Eye-Fi SD card to transfer images to my app somehow! link:http://www.eye.fi/products/iphone
Build a circuit, to make a wired connection to the iPhone with the 30-Pin dock connector
Here are the problems I'm facing with each of these. My actual questions for you guys are highlighted in BOLD:
The iOS Bluetooth framework (4S only) only supports Low Energy devices. Looking at the modules out there, like this one, I doubt it will work for image transfer, which seems to be a bulky task for low-energy Bluetooth. I know there are jailbreak apps on the Cydia store which do regular Bluetooth transfers, but I was unable to find the private APIs for such a task. (NOTE: I'm making this app for my own purposes, so feel free to suggest any private/unofficial APIs.) Question #1: How can I interface with a regular Bluetooth device (not another iPhone) and transfer data? (I've included a rough sketch of what I was imagining after this list.)
The Eye-Fi card sounded amazing as a consumer, because the company has its own proprietary iPhone app to transfer the images from the Eye-Fi SD card. The problem is I can't figure out how to easily interface with the Eye-Fi card in my code. I researched the iOS CFNetwork framework, but haven't had any luck. Question #2: How can I interface with the Eye-Fi card in my app?
Building a circuit seems simple enough with this development board, but I read somewhere that the iPhone may not recognize an "un-registered" accessory. I have a developer license but not an MFi license. Question #3: Do I need to be registered as an MFi developer to create and use this external accessory in my app for my own purposes?
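In case it helps, here's roughly what I was imagining for the Bluetooth route with CoreBluetooth, assuming the camera circuit exposed a BLE service (the service/characteristic UUIDs and the chunked framing below are just my guesses, and BLE throughput would make image transfers slow):

```swift
import CoreBluetooth

/// Sketch of receiving image data from a hypothetical BLE camera peripheral.
/// Assumed details: the service/characteristic UUIDs and the idea that the
/// image arrives as a stream of notification chunks ended by an empty chunk.
final class CameraBLEReceiver: NSObject, CBCentralManagerDelegate, CBPeripheralDelegate {
    // Hypothetical UUIDs advertised by the camera circuit.
    private let serviceUUID = CBUUID(string: "FFE0")
    private let imageCharUUID = CBUUID(string: "FFE1")

    private var central: CBCentralManager!
    private var camera: CBPeripheral?
    private var imageBuffer = Data()

    override init() {
        super.init()
        central = CBCentralManager(delegate: self, queue: nil)
    }

    func centralManagerDidUpdateState(_ central: CBCentralManager) {
        guard central.state == .poweredOn else { return }
        central.scanForPeripherals(withServices: [serviceUUID], options: nil)
    }

    func centralManager(_ central: CBCentralManager, didDiscover peripheral: CBPeripheral,
                        advertisementData: [String: Any], rssi RSSI: NSNumber) {
        camera = peripheral                      // keep a strong reference
        central.stopScan()
        central.connect(peripheral, options: nil)
    }

    func centralManager(_ central: CBCentralManager, didConnect peripheral: CBPeripheral) {
        peripheral.delegate = self
        peripheral.discoverServices([serviceUUID])
    }

    func peripheral(_ peripheral: CBPeripheral, didDiscoverServices error: Error?) {
        guard let service = peripheral.services?.first(where: { $0.uuid == serviceUUID }) else { return }
        peripheral.discoverCharacteristics([imageCharUUID], for: service)
    }

    func peripheral(_ peripheral: CBPeripheral, didDiscoverCharacteristicsFor service: CBService, error: Error?) {
        guard let char = service.characteristics?.first(where: { $0.uuid == imageCharUUID }) else { return }
        peripheral.setNotifyValue(true, for: char)   // stream image chunks as notifications
    }

    func peripheral(_ peripheral: CBPeripheral, didUpdateValueFor characteristic: CBCharacteristic, error: Error?) {
        guard let chunk = characteristic.value else { return }
        imageBuffer.append(chunk)
        // Assumed end-of-image marker from the circuit: an empty chunk.
        if chunk.isEmpty {
            print("Received image of \(imageBuffer.count) bytes")
            imageBuffer.removeAll()
        }
    }
}
```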
You might try setting something up through a serial port, since joining the MFi program is prohibited for individuals. You could possibly use a connector like this one: http://www.amazon.com/neXplug-Ultra-Small-Micro-Adapter/dp/B0055PCVDO/ref=sr_1_1?ie=UTF8&qid=1339309918&sr=8-1
For individuals/hobbyists, the Apple website recommends "that you use a third-party solution which will allow you to connect iOS devices to serial devices and to write iOS apps that communicate with these serial devices" (from mfi.apple.com/faq).
I am also working on an external camera that can hook up to the iPhone/iPad. I will be using a serial port in order to get around the MFi requirement for external iPhone/iPad devices. Trying to use Bluetooth is too complicated, and the data stream isn't big enough for pictures; the wired version will work much better.
I hope this helps and that your college term and project are not already finished. Best of luck.
As T Reddy has already mentioned, if you want to create hardware that interfaces with iOS through the external accessory framework, you have to sign up for the Apple MFi program, which you, as an individual, cannot do.
I'm not sure how the Eye-Fi system works, but it sounds to me like it basically syncs the images to their server, and once you download their app, the app can sync the photos for you.
Whether you are using Bluetooth or the 30-pin connector, there is no way to interface with an external device unless that device is MFi compliant and part of the MFi program. I suggest you try the following options to solve this dilemma:
If this is a "Senior Project" at a university, see if your university is part of MFi. Apple will not let individuals join the program, so if you are going to gain access, you have to go through another organization or, possibly, an educational institution. I don't know if Apple has worked with schools in this regard, but you never know; it might be possible.
If your school isn't in the MFi program then you may want to consider re-writing your application for an Android device. Android devices are not locked down like iOS devices, so that may be a more reasonable approach.
I hate to bring bad news, but circumventing these hardware restrictions on an iOS device is effectively prohibited. Your options are quite limited, and none of them are probably what you either want or need to hear.
We are developing a game application for BlackBerry which will have a multiplayer option to let two or more players compete against each other. We have implemented the logic for that, and two or more players are already able to play the game simultaneously on a single device.
Now we want to upgrade our application so that two or more users can compete against each other playing from different devices and different locations.
Can anyone please help me by providing a way (or code) to achieve this (having two BlackBerry devices communicate over the air to access a single session programmatically)?
I found code to connect two devices using Bluetooth, but as the players may be situated in different geographic regions, we need to achieve this using an over-the-air connection.
What about PIN messaging? After one player makes a move, send PIN messages about the move to the other players, then read the received PIN messages in your code using a folder listener and update the UI.
PIN messaging is free,
but both handsets must be BlackBerry devices.