I started UE4 and Oculus Go development 2 days ago. I saw YouTube videos where people were able to instantly connect the device to UE4.
I followed the full documentation: turned on developer mode, installed the Android SDK, enabled the plugin, and completed every small step in the docs.
Now when I connect the Oculus Go via USB, it shows up in the Launch options under the name 'Pacific (something)'. I deployed the starter VR project to it, but nothing was moving. How do I control it on the Oculus Go?
Second, the VR Preview option remains grayed out no matter what I do.
UE4 version - 4.20.3
Machine - Windows (i5, 8GB)
Thanks.
We currently have a Xamarin.Forms application that requires a Mac to build the Xamarin.iOS project. We purchased two Mac Minis to do this, but they are located in our office, and due to Coronavirus we are all working from home for the foreseeable future. We are trying to use Microsoft's wireless deployment feature (https://learn.microsoft.com/en-us/xamarin/ios/deploy-test/wireless-deployment?tabs=windows). This works if I'm in the office on the same network, or if I take the Mac Mini home, because all the hardware (laptop with VS 2019, Mac Mini, and iPhone) is then on the same network. However, my laptop does not detect the phone when the Mac Mini is in the office and my laptop and iPhone are at home on the VPN. I'm going to speak to my company's IT department to see if any settings on our VPN side are to blame, but any suggestions or assistance would be much appreciated!
What I've Tried:
I've tried connecting the iPhone to the VPN and deploying to it wirelessly from Visual Studio 2019, but my device is not detected.
I've looked for an answer through Stack Overflow's "Similar questions".
A quick Google search didn't retrieve anything relevant to wireless debugging/deployment over a VPN.
I've searched Microsoft Community to see if this issue had been raised.
I am a teacher who has developed an app that streams video tutorials to mobile devices. The app has been around for a while and was originally built in CS6 using AIR SDK 20. More recently I have moved to CC and published the app using CC and AIR SDK 23 and 24 (including beta 24.0.180).
The app now crashes on iOS on first access after installing it. When you open the app up again, it opens, but it will crash when pulling up Control Centre, pulling down the notification bar, clicking the home button, or minimising the app with a pinch.
I have stripped out the code and have isolated the problem to video streaming within the project. The issue can be replicated using the XML and SWF files found here on Dropbox:
https://www.dropbox.com/sh/h4rht8ksps2k1gg/AADdlFLst5CALJgl6oUsziIUa?dl=0
(Everything has been stripped from the project, so I appreciate that no video will play - it didn't need to play to replicate the issue.) The app has been installed on an iPad Air with iOS 10 and an iPad Mini 1 with iOS 9, but my students have also experienced the issue on iPhones running iOS 7+.
Any help with this would be much appreciated as my students would like to use the app without the issues.
Kind regards
Matt
Do you plan to publish an SDK for the Alpha and NEX cameras? You publish some apps yourself, and it would be good to see what the developer community out there could do with these devices.
In particular, I would like to see a studio app that causes the OLED viewfinder to show a well-exposed image regardless of manual camera settings. That would allow me to use the A6000, A7R, and the like in a studio with high-power studio strobes.
Many thanks
Nick SS
The original poster is/was asking Sony how to write embedded applications for Sony cameras that support after-the-fact installation of "apps" from the Sony "PlayMemories" application (aka "store"). There is also a somewhat related (but technically very different) SDK by Sony called the "Remote API", which is used by apps outside the camera (typically on Android and iOS phones) to remotely perform actions on the camera over Wi-Fi. One is an in-camera app, the other is an out-of-camera app.
The short answer is that there has been some reverse engineering to discover that these cameras apparently run some variant of Android itself, AND someone has figured out how to duplicate the process of installing your own "app" on the camera. See here: https://github.com/ma1co/Sony-PMCA-RE
Here is a link to Sony's SDK site. It's still in beta and is only available in APK format.
https://developer.sony.com/downloads/camera-file/sony-camera-remote-api-beta-sdk/
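Under the hood, the Remote API is just JSON posted over HTTP to the camera, so you can experiment with it before pulling in any SDK. Here is a minimal sketch (in Swift, though the official sample is Android); the endpoint address is an assumption, since officially you discover it via SSDP, and the camera must already be in its remote-control Wi-Fi mode:

    import Foundation

    // Typical direct Wi-Fi address for many Sony cameras; the official way to
    // find the endpoint is SSDP discovery, so treat this URL as an assumption.
    let endpoint = URL(string: "http://192.168.122.1:8080/sony/camera")!

    // Send one Camera Remote API call and print the raw JSON response.
    func call(_ method: String, params: [Any] = []) {
        var request = URLRequest(url: endpoint)
        request.httpMethod = "POST"
        request.setValue("application/json", forHTTPHeaderField: "Content-Type")
        let body: [String: Any] = ["method": method, "params": params,
                                   "id": 1, "version": "1.0"]
        request.httpBody = try? JSONSerialization.data(withJSONObject: body)

        let done = DispatchSemaphore(value: 0)
        URLSession.shared.dataTask(with: request) { data, _, error in
            if let data = data, let text = String(data: data, encoding: .utf8) {
                print("\(method) -> \(text)")
            } else {
                print("\(method) failed: \(error?.localizedDescription ?? "no response")")
            }
            done.signal()
        }.resume()
        done.wait()  // keep this command-line sketch synchronous
    }

    call("getAvailableApiList")  // ask what this body supports
    call("startRecMode")         // bodies like the A6000 need this before shooting
    call("actTakePicture")       // trigger the shutter

The startRecMode call matters on interchangeable-lens bodies like the A6000, which tend to reject actTakePicture until remote shooting mode is active.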
I looked over the documentation from the Sony SDK site and added the rest of the functions for the A6000 camera. You can find it on GitHub. It's still a bit of a work in progress, and I have not yet tested all the functions. This repo will generate a jar file that you can use in any APK application.
https://github.com/keyserSoze42/SonySDK
Here is the repo for the APK that Sony built. I parsed out the SDK part, and the app uses the SDK built from the other repo. You should be able to find it in the libs directory.
github.com/keyserSoze42/SonySampleApp
Currently only the Camera Remote API (beta) is published, and it is constantly being updated with new capabilities and new devices.
Has anyone tried to develop an app for the Double Robotics Telepresence Robot? I downloaded the SDK from https://github.com/doublerobotics/Basic-Control-SDK-iOS and deployed the sample app to my iPad. I connected the iPad to my robot via Bluetooth but was unable to operate the robot through the app. How do I go about developing an app for the robot?
If you look at the README on the GitHub page, it has some example code for a basic app. You just need to add some controls to the UI and connect up the IBActions and IBOutlets. I created my own app with a subset of the features - forward, back, left, and right arrows; park and unpark buttons (for deploying/retracting the kickstand) - see the sketch at the end of this answer.
One thing I haven't been able to get working is the travel data.
There is also an example app in the repository, but it appears to be targeted at iOS 6, so it may take some extra effort to get it working.
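If it helps, here is a rough Swift sketch of that wiring. The DRDouble singleton, the doubleDriveShouldUpdate: callback, variableDrive:turn:, and the kickstand methods are taken from the Objective-C example in the README; the Swift bridging shown here and the specific button outlets are my assumptions, so treat this as a sketch rather than drop-in code:

    import UIKit
    // The Basic Control SDK is Objective-C; expose DRDouble to Swift through
    // a bridging header that imports <DoubleControlSDK/DoubleControlSDK.h>.

    class DriveViewController: UIViewController, DRDoubleDelegate {
        // Hypothetical buttons, wired up in the storyboard.
        @IBOutlet weak var forwardButton: UIButton!
        @IBOutlet weak var backButton: UIButton!
        @IBOutlet weak var leftButton: UIButton!
        @IBOutlet weak var rightButton: UIButton!

        override func viewDidLoad() {
            super.viewDidLoad()
            DRDouble.shared().delegate = self  // assumed bridging of +sharedDouble
        }

        // The SDK polls this while connected; report the current button state.
        func doubleDriveShouldUpdate(_ theDouble: DRDouble!) {
            let drive: Float = forwardButton.isHighlighted ? 1.0
                             : (backButton.isHighlighted ? -1.0 : 0.0)
            let turn: Float = rightButton.isHighlighted ? 1.0
                            : (leftButton.isHighlighted ? -1.0 : 0.0)
            theDouble.variableDrive(drive, turn: turn)
        }

        // Park / unpark by deploying or retracting the kickstands.
        @IBAction func park(_ sender: Any) { DRDouble.shared().deployKickstands() }
        @IBAction func unpark(_ sender: Any) { DRDouble.shared().retractKickstands() }
    }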
I recently read an article which states that web apps on iOS launched from the home screen, running in full-screen mode, have slower performance than web apps running inside Safari.
Then I found a follow-up article which seems to suggest that the issue above is fixed.
Does anyone know if this is confirmed?
According to information from AppleInsider, the iOS 5 beta fixes that problem, and the Nitro JavaScript engine is now enabled in Web.app.
I have iOS 5 installed on my iPhone 4, and I updated the SunSpider JavaScript testing framework 0.9.1 to be able to start it as a full-screen web application under iOS. I then ran SunSpider several times in full-screen web-app mode and in Mobile Safari; my results are below.
Maybe something was fixed (AppleInsider reports a difference of 4 vs 10 seconds), but I can't say that performance is equal in both cases: 3756.5 ms vs 5243.8 ms.
UPD
A small but interesting note about UIWebView: it is impossible to use the Nitro-enabled JavaScript engine in native applications (meaning applications built in Xcode and posted to the App Store), because the Nitro JIT requires the ability to use dynamic code signing.
UPD
Look at "iOS 5 Top 10 Browser Performance Changes" on blaze.io. It seems Apple enabled Nitro for full-screen web apps in iOS 5 (there are nice statistics in the article).