Has anyone tried to develop an app for the Double Robotics Telepresence Robot? I downloaded the SDK from https://github.com/doublerobotics/Basic-Control-SDK-iOS and deployed the sample app to my iPad. I connected the iPad to my robot via Bluetooth but was unable to operate the robot through the app. How do I go about developing an app for the robot?
If you look at the readme on the GitHub page, it has some example code for a basic app. You just need to add some controls to the UI and wire up the IBActions and IBOutlets. I created my own app with a subset of the features: forward, back, left and right arrows, plus park and unpark buttons (for deploying/retracting the kickstand).
One thing I haven't been able to get working is the travel data.
There is also an example app in the repository, but it appears to be targeted at iOS 6, so it may take some extra effort to get it working.
For 'check-in' purposes, my staff have mobile devices (iOS and Android) that open my web app in a browser. They open a page that lets them scan a customer's QR code to check them in. I would like this to involve as little user interaction as possible: all the staff need to do is keep the web app open and scan the QR code. The web app then calls an API on my server which checks the customer in. So accessing the camera's view is best; I can then run a QR code scanner on it.
I've been able to do this on Android (using getUserMedia), but it doesn't work on iOS.
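Roughly, what works for me on Android is the sketch below; jsQR just stands in for whatever QR decoder you use, and /api/checkin is a placeholder for my real check-in endpoint:

```javascript
// Rough sketch of the getUserMedia + canvas approach that works on Android.
// Assumes a QR decoder such as jsQR (https://github.com/cozmo/jsQR) has been
// loaded via a <script> tag; "/api/checkin" is a placeholder endpoint.
const video = document.createElement('video');
const canvas = document.createElement('canvas');
const ctx = canvas.getContext('2d');

navigator.mediaDevices.getUserMedia({ video: { facingMode: 'environment' } })
  .then((stream) => {
    video.srcObject = stream;
    video.setAttribute('playsinline', ''); // needed for inline playback on iOS Safari
    video.play();
    requestAnimationFrame(scanFrame);
  })
  .catch((err) => console.error('Camera access failed:', err));

function scanFrame() {
  if (video.readyState === video.HAVE_ENOUGH_DATA) {
    canvas.width = video.videoWidth;
    canvas.height = video.videoHeight;
    ctx.drawImage(video, 0, 0, canvas.width, canvas.height);
    const image = ctx.getImageData(0, 0, canvas.width, canvas.height);
    const code = jsQR(image.data, image.width, image.height);
    if (code) {
      // Call the check-in API with the decoded customer code.
      fetch('/api/checkin', {
        method: 'POST',
        headers: { 'Content-Type': 'application/json' },
        body: JSON.stringify({ customerCode: code.data }),
      });
    }
  }
  requestAnimationFrame(scanFrame);
}
```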
I'm currently using Vue.js (v1) and would like to keep it that way.
Hybrid apps: I've looked at OnsenUI (which seems to only work with Vue 2) and Ionic (which doesn't let me build/run the iOS platform since I'm on Windows).
The web app is written with Vue.js (v1), HTML, and JS, running on a Tomcat 7 server.
Are there any suggestions?
I've found a way to use PhoneGap to allow this.
Edit: PhoneGap lets you create a hybrid app that looks like a native app (iOS or Android). All I did was install PhoneGap and use one of its plugins (barcodescanner). This is the reference I used.
But, from my understanding, you'll need to publish the app on the App Store, which is a lot of hassle (and costs money).
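For anyone curious, the plugin call itself is small. Roughly, it looks like the sketch below (based on the phonegap-plugin-barcodescanner plugin; the /api/checkin endpoint is again just a placeholder):

```javascript
// Rough sketch of scanning a QR code inside the hybrid app with
// phonegap-plugin-barcodescanner. "/api/checkin" is a placeholder endpoint.
document.addEventListener('deviceready', function () {
  cordova.plugins.barcodeScanner.scan(
    function (result) {
      if (!result.cancelled && result.format === 'QR_CODE') {
        // Check the customer in with the scanned code.
        fetch('/api/checkin', {
          method: 'POST',
          headers: { 'Content-Type': 'application/json' },
          body: JSON.stringify({ customerCode: result.text }),
        });
      }
    },
    function (error) {
      console.error('Scanning failed:', error);
    },
    { preferFrontCamera: false, showTorchButton: true, formats: 'QR_CODE' }
  );
}, false);
```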
I would like to create a prototype to show an app concept. It would be a voice-based cooking assistant, utilizing voice recognition.
The user would speak to an iPhone/iPad and navigate through a recipe, which would be spoken back to them. The prototype will be shown at a student exhibition; there will be space to hide a Mac, and the possibility of connecting the iPhone to it by cable as well.
I have no experience with Xcode, though I have some experience with programming and coding. I've made some prototypes in Flinto/InVision, but I guess that for this project native programming is a must; or am I wrong?
How hardcore would I need to go to be able to make this work?
Is there any way to fake it, for example by using the voice detection in OS X or the Web Speech API in Chrome and making a web app which would be streamed to the iPhone/iPad as if to a secondary monitor?
Thank you for all suggestions.
You can have a "script" and follow it; you don't have to develop a fully functional app, just something able to show the concept.
There is a WWDC video on prototyping concepts that is highly recommended:
https://developer.apple.com/videos/play/wwdc2014-223/
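If you do want to try the Web Speech API route you mentioned, the Chrome-only part is small. Here is a rough sketch (the recipe steps and voice commands are placeholders):

```javascript
// Rough sketch of faking the voice assistant in Chrome with the Web Speech API.
// The recipe steps are placeholders; this only demonstrates listen -> respond.
const steps = [
  'Preheat the oven to 200 degrees.',
  'Chop the onions finely.',
  'Simmer the sauce for ten minutes.',
];
let current = 0;

function speak(text) {
  speechSynthesis.speak(new SpeechSynthesisUtterance(text));
}

const recognition = new webkitSpeechRecognition(); // Chrome-prefixed API
recognition.continuous = true;
recognition.interimResults = false;

recognition.onresult = (event) => {
  const phrase = event.results[event.results.length - 1][0].transcript.toLowerCase();
  if (phrase.includes('next')) {
    current = Math.min(current + 1, steps.length - 1);
    speak(steps[current]);
  } else if (phrase.includes('repeat')) {
    speak(steps[current]);
  } else if (phrase.includes('back')) {
    current = Math.max(current - 1, 0);
    speak(steps[current]);
  }
};

recognition.start();
speak(steps[current]);
```

You could run that page on the hidden Mac and mirror or stream it to the iPad; for an exhibition demo, that is usually convincing enough.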
We were exploring various test suites for mobile automated testing and ran into a company called Perfecto Mobile. One of the features that blew me away was that they are able (without jailbreaking) to effectively perform a "remote desktop" on a physical iPad.
So the iPad's screen is mirrored within a web application, which can register touch/swipe events in the browser and perform them on the device. The only relevant technical detail I have is that all of this is performed using commands sent over the USB cable.
I'm really curious as to how this is implemented, and about the relevant private APIs, if any.
Thanks,
Teja
I'm not familiar with PerfectoMobile, but I can give you a few pointers on how this can be accomplished:
For the mirroring, one way would be to look at AirPlay. The APIs are pretty well documented, but not for doing what we're talking about, which would require some serious reverse engineering; it's definitely possible, though, and these guys have done it. A different approach would be to run a background app that periodically takes snapshots of the main screen and sends them over a socket connection to a client. You could do this as a VNC server, and to incorporate the remote view in a web app you could use noVNC. As for using a USB connection, in the case of the background app talking to a client over TCP, you could do a port forward.
To actually perform on the device the touch events sent from your remote viewer, most people have been using the GSEvent group of functions from the GraphicsServices private framework, without needing to jailbreak the device. Again, a background app would receive over a socket an instruction such as "tap there", instantiate the GSEvent, and inject it so it gets processed in the run loop of the frontmost app.
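To make the client side of that snapshot-over-socket idea concrete, the web viewer can be as simple as the sketch below: it draws incoming JPEG frames onto a canvas and sends tap coordinates back for the on-device agent to turn into GSEvents. The WebSocket URL and message format here are purely illustrative:

```javascript
// Rough sketch of the browser side of the snapshot-over-socket approach.
// Assumes a <canvas id="screen"> element on the page; the ws:// URL and the
// JSON message shapes are illustrative, not any particular product's protocol.
const canvas = document.getElementById('screen');
const ctx = canvas.getContext('2d');
const socket = new WebSocket('ws://localhost:9000/screen');

socket.onmessage = (event) => {
  // Each message is assumed to be a base64-encoded JPEG of the device screen.
  const frame = new Image();
  frame.onload = () => ctx.drawImage(frame, 0, 0, canvas.width, canvas.height);
  frame.src = 'data:image/jpeg;base64,' + event.data;
};

canvas.addEventListener('click', (event) => {
  // Translate canvas coordinates to normalized device coordinates and send a
  // tap command; the on-device agent would turn this into a GSEvent and inject it.
  const rect = canvas.getBoundingClientRect();
  const x = (event.clientX - rect.left) / rect.width;
  const y = (event.clientY - rect.top) / rect.height;
  socket.send(JSON.stringify({ type: 'tap', x: x, y: y }));
});
```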
These possibilities, at least, have been implemented successfully in different iOS apps up to iOS 6.1 (iOS 7 is a different animal). You won't find any such app in the App Store, since Apple clearly prohibits the use of private frameworks in third-party apps; instead, people deploy them in-house using enterprise and ad-hoc provisioning profiles. On Android, however, there's VMLite available in the Play Store.
If you're looking to share the screen from iOS/Android, check out skreen.me. They have sample apps you can try out, and they also provide libraries for mobile app integration.
We have recently developed an iPad application and now need to start demonstrating it to customers and prospects as part of our overall product suite during webinars. As part of our Agile methodology, we also need to periodically review the application with key customers without having to distribute it since the application is not a standalone application and requires a connection to web services installed at each customer site.
We have searched high and low for any solution that doesn't involve jailbreaking the device, but have been unable to find one. The most common suggestion seems to be to point a webcam at the device, but that comes across as very unprofessional.
I know that there are VGA out adapters that can be plugged into the iPad and we have used these to present through a projector when the customer is physically present, but this is a relatively rare occurrence. Perhaps there are solutions that we are unaware of that can be used to send VGA output back into a desktop device for screen sharing?
Put a Slingbox on your LAN and connect the iPad video to the Slingbox video input. Then use a web browser on your computer to view the Slingbox feed and share your screen with WebEx as usual.
EDIT: BTW, there are other gadgets besides the Slingbox for getting composite video into a computer, such as the Elgato Video Capture, to name one.
A better option would be to use http://www.reflectionapp.com/
(I'm not affiliated with the company.)
I use this app:
http://www.airserverapp.com
It can be used with Windows or Mac.
It's easy to use; the only trouble I have is that I keep poking my PC instead of the iPad.
I'm trying to find out whether Appcelerator's Titanium is good for iPad app development, or whether it can even be used for that.
There seemed to be an announcement in April 2010 of a 'Titanium Tablet' package, although I can find no further mention of this. From the forums (where I've also asked this question, but have yet to receive any responses) it sounds like people are developing iPad apps, but I've yet to receive a definitive answer.
Any help would be greatly appreciated!
Cheers,
Toby
Yes, you can use Titanium to develop iPad apps. There are a few iPad-only UI elements in the API: here
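For example, the iPad-specific pieces look roughly like the sketch below (written against the old Titanium.UI.iPad module; exact property names may vary between SDK versions):

```javascript
// Rough sketch of the iPad-specific Titanium APIs (SplitWindow and Popover).
// Property names may differ slightly between Titanium SDK versions.
var masterWin = Ti.UI.createWindow({ title: 'Recipes', backgroundColor: '#fff' });
var detailWin = Ti.UI.createWindow({ title: 'Detail', backgroundColor: '#eee' });

// iPad-only split view: master on the left, detail on the right.
var splitWindow = Ti.UI.iPad.createSplitWindow({
  masterView: masterWin,
  detailView: detailWin
});
splitWindow.open();

// iPad-only popover, anchored to a button in the detail view.
var button = Ti.UI.createButton({ title: 'Options', top: 20, left: 20 });
detailWin.add(button);

var popover = Ti.UI.iPad.createPopover({ width: 250, height: 150, title: 'Options' });
button.addEventListener('click', function () {
  popover.show({ view: button });
});
```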
Yup, Titanium works for iPad apps. Like Dave said above, there are some (two, to be exact, as of today) iPad-specific APIs; other than those, you just use all the other existing APIs, and when you create your project in Titanium Developer you choose iPad as the project type.
The API documentation (linked to the iPad-specific APIs, but the list on the left has all the available APIs for mobile development):
http://developer.appcelerator.com/apidoc/mobile/latest/Titanium.UI.iPad-module
To get an idea of what you can do, take a look at the Kitchen Sink apps. There are two: one for the iPad, which is basic and mostly just shows the iPad-specific API usage, and one for the iPhone, which has nearly every available API on display. They are ugly apps, but they do the job of showing you what's possible and give some decent reference code. You can find the Kitchen Sink apps on GitHub (link below) and go here (http://developer.appcelerator.com/doc/kitchensink) to see how to run them in the simulator (requires Xcode, which is free from Apple).
Also, you can find some showcased iPad apps built with Titanium on the Appcelerator website (link below); just search the page for the word iPad.
Kitchen Sink Apps : github.com/appcelerator/KitchenSink
Appcelerator Showcase : appcelerator.com/showcase/applications-showcase/
(Sorry for the non-linked URLs, but I can't post more than one link until I have more than 10 reputation points.)
Currently you can build for iPod, iPad, and Android. BlackBerry is also available if you're a paying member.
The source is also on GitHub if you want to have a look first.
http://github.com/appcelerator/titanium_mobile