I'm not sure if I'm on the right site, but I'd appreciate it if someone could point me in the right direction, as I'm lost on where to start.
So basically, I want to create a mobile app that runs in the background and can detect words in a video, then run some logic after detecting them. The logic is as follows:
After detecting a certain word, it performs a sequence of taps on the screen and posts to an API.
Direction, links, search terms, or anything to get me started developing it would be greatly appreciated.
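To give one concrete starting point for the "detect words" part: on iOS, the Vision framework can recognize text in individual video frames. Here's a minimal sketch (the function name is mine, and this covers only recognition, not the background tap automation, which is heavily restricted on both platforms):

```swift
import Vision
import CoreGraphics

// Recognize the words visible in a single video frame.
func recognizeWords(in frame: CGImage, completion: @escaping ([String]) -> Void) {
    let request = VNRecognizeTextRequest { request, _ in
        let observations = request.results as? [VNRecognizedTextObservation] ?? []
        // Take the best candidate string from each detected text region.
        let words = observations.compactMap { $0.topCandidates(1).first?.string }
        completion(words)
    }
    request.recognitionLevel = .fast // favour speed for per-frame use
    let handler = VNImageRequestHandler(cgImage: frame, options: [:])
    try? handler.perform([request])
}
```

Useful search terms: "on-device OCR", "VNRecognizeTextRequest", and, on Android, "ML Kit text recognition" plus "AccessibilityService" for the tap-automation side.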
I'm new to the iOS development world, so I'm starting off with a game that's neither Clash of Clans nor the infamous 'I Am Rich'. I know Python and HTML/CSS, so I can cope with the technical terms, but it would be great if you could explain any answers.
A pattern flashes on the screen for half a second; let's say it is 'arrow left, arrow right, arrow left' (in picture form). Then the user has to replicate the pattern (in the right order) to boost forward. Basically, I'm asking: what would be the simplest/most effective way of programming this? It would be great if I could easily add more patterns.
One way I thought it could be done is to have an ID for each pattern, with pre-defined 'nodes' that have to be swiped in a particular order. One benefit of this idea is that I can add patterns; however, it could take a while to program them. Another idea I had is to assign an 'eraser' to the user's brush and detect when roughly 95% of the pattern has been swiped over. However, this would allow the user to swipe in any order.
Perhaps there is something super easy that I can do, but at the moment I have no idea. It would be great if I could do this in Swift; however, Objective-C is not a problem either.
Thank you in advance,
Will
You basically answered your own question in the title. Take a look at the UISwipeGestureRecognizer documentation. From there you'd just want to push gestures into an array and then compare it to your existing pattern arrays.
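A minimal sketch of that idea in Swift (the pattern data here is a placeholder; load yours however you define them):

```swift
import UIKit

class PatternViewController: UIViewController {

    // The pattern the player must replicate; placeholder data.
    let targetPattern: [UISwipeGestureRecognizer.Direction] = [.left, .right, .left]
    var input: [UISwipeGestureRecognizer.Direction] = []

    override func viewDidLoad() {
        super.viewDidLoad()
        // One recognizer per direction we care about.
        for direction in [UISwipeGestureRecognizer.Direction.left, .right, .up, .down] {
            let swipe = UISwipeGestureRecognizer(target: self,
                                                 action: #selector(handleSwipe(_:)))
            swipe.direction = direction
            view.addGestureRecognizer(swipe)
        }
    }

    @objc func handleSwipe(_ recognizer: UISwipeGestureRecognizer) {
        input.append(recognizer.direction)
        // Once enough swipes are collected, compare against the target.
        if input.count == targetPattern.count {
            if input == targetPattern {
                print("Pattern matched – boost!")
            }
            input.removeAll()
        }
    }
}
```

Since each pattern is just an array of directions, adding a new pattern is a one-line change rather than new logic.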
This is more of a theoretical question. I'm fairly new to iOS programming, so I haven't fully nailed the terms to use in scenarios like this. I've been asked to create an application where I need to fire an event to capture an image programmatically. The application will be in the foreground and will have the requisite permissions to use the camera. I've been unlucky with my searches so far, likely because I haven't termed them correctly.
My question is this: given an application with a camera view, set up to capture images, is it possible for me to fire an event within said app to capture the image, without a button necessarily being pressed? If so, how would I go about finding documentation to guide me through it?
In an ideal world, I'm hoping for something as simple as cameraObject.capture() or something of that ilk - but an ideal world is a rarity!
Thanks in advance!
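As it happens, AVFoundation offers almost exactly the cameraObject.capture() you're hoping for: AVCapturePhotoOutput can be triggered from any event you like. A minimal sketch, assuming you can target the modern photo-capture API (error handling trimmed):

```swift
import AVFoundation
import UIKit

class CameraController: NSObject, AVCapturePhotoCaptureDelegate {

    let session = AVCaptureSession()
    private let photoOutput = AVCapturePhotoOutput()

    // Wire the default camera and a photo output into the session.
    func configure() throws {
        guard let camera = AVCaptureDevice.default(for: .video) else { return }
        let input = try AVCaptureDeviceInput(device: camera)
        if session.canAddInput(input) { session.addInput(input) }
        if session.canAddOutput(photoOutput) { session.addOutput(photoOutput) }
        session.startRunning()
    }

    // Call this from any event you like – a timer, a gesture, a push message.
    func capture() {
        photoOutput.capturePhoto(with: AVCapturePhotoSettings(), delegate: self)
    }

    // Delegate callback delivering the captured photo.
    func photoOutput(_ output: AVCapturePhotoOutput,
                     didFinishProcessingPhoto photo: AVCapturePhoto,
                     error: Error?) {
        guard error == nil,
              let data = photo.fileDataRepresentation(),
              let image = UIImage(data: data) else { return }
        print("Captured image: \(image.size)")
    }
}
```

The search terms that unlock the documentation are "AVCaptureSession" and "AVCapturePhotoOutput".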
Maybe I'm just searching for the wrong term, but I've been able to find very little information on this subject, and I think it could be a problem for my app.
A while back, there was an article on the accuracy of the touch screens on iOS devices, and it seemed quite poor compared to other phones. Here is a link to a post about it:
http://forums.macrumors.com/showthread.php?t=1660713
Anyway, many of the commenters referred to "perspective compensation" as a cause of the inaccuracy. Basically, they are saying that iOS intentionally registers touches above the actual point of contact to compensate for the typical viewing angle of the user, or for the angle of their finger, or something like that. I found some credibility in that claim myself by doing as one of the commenters suggested and trying to use my iPhone upside down. I did find that it was difficult to touch things in some cases, and I have also noticed this problem in one of the apps I'm developing.
So, in case you want to skip all that rambling above, here is why it's a problem for me:
I am developing an app that is intended to be used by two people at the same time. The iPhone or iPad is placed on a surface between two people who are sitting across from one another, and they are instructed to quickly and accurately touch items on their respective halves of the screen competitively. What the article's comments made me suspect might happen, and what I have also found in practice, is that the person using the phone upside down will have trouble touching buttons and dots on their first try. I've also tested slowly with a stylus and found that the touchable area of a button does indeed extend below the button, or above the button for the person using the phone upside down, hence the discrepancy and the problem/disadvantage for that person.
So finally, if you want to skip that as well, here is my question: can "perspective compensation" (if that's what it's called) be disabled programmatically, and can it be done for specific views of an app? Have any of you noticed this and dealt with it in an app of yours?
While I have found that "perspective compensation" does seem to be occurring, I have not found any official documentation of it, and therefore have no idea how or if it can be disabled. When I search for "perspective compensation," the only results I find are links to the same article and comments.
I can't help but expect that this may have been asked before or is solvable with a simple checkbox, but perhaps for lack of the correct term to use, I have been unable to find any leads.
Thanks in advance for any of your solutions or suggestions!
This can't be done with the current SDK. All we have access to is the touch location, which is at a single point. Other search terms you might try are "digitizer" or "raw touch data", but there is definitely no check box or simple option.
To implement this, you will have to compensate for the touch location yourself. You'll need to play around with a compensating offset value for the upside-down buttons. Hit testing on views is probably the best place to do this; then your buttons can just respond to events as normal.
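A minimal sketch of that idea (touchOffset is a placeholder value; tune its sign and size by experiment):

```swift
import UIKit

// A button whose touch-sensitive area is shifted vertically, to counteract
// the systematic touch offset for the player facing the "wrong" way.
class CompensatedButton: UIButton {

    // Placeholder value – tune the sign and magnitude by experiment.
    var touchOffset: CGFloat = 6.0

    override func point(inside point: CGPoint, with event: UIEvent?) -> Bool {
        // Test against a point shifted by the compensating offset.
        let shifted = CGPoint(x: point.x, y: point.y + touchOffset)
        return bounds.contains(shifted)
    }
}
```

You would use this class only for the upside-down player's buttons and leave the right-side-up player's buttons standard.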
I want to know if it is possible to simulate the little movement of the icons on the iPhone springboard when you long-press one of them. Can you help me?
LOL!
I was at a conference last year, and one of the speakers was an ex-Apple employee who was around at the time that icon wobble was created.
When they were creating it, they used a combination of scale, rotation, and translation animations (up/down and left/right).
When they showed it to Steve Jobs, he wasn't satisfied with any of the wobbles he was shown.
In the end they created some custom sliders (behind the home screen) that he could access, so that he could customise the animation himself and get it "just right".
I know this doesn't help at all but thought it might be interesting.
Anyway, it looks like the link that Robotic Cat provided in the comments might give you something worth looking at.
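For anyone who wants to play with the same ingredients, here is a minimal Core Animation sketch of a wobble; the durations and angles are guesses to tweak, not Apple's actual values:

```swift
import UIKit

extension UIView {
    // Starts a springboard-style wobble: a small rotation oscillation
    // combined with a slight vertical bob.
    func startWobble() {
        let rotation = CAKeyframeAnimation(keyPath: "transform.rotation.z")
        rotation.values = [-0.04, 0.04, -0.04]   // radians; tweak to taste
        rotation.duration = 0.12
        rotation.repeatCount = .infinity

        let bob = CAKeyframeAnimation(keyPath: "transform.translation.y")
        bob.values = [-1.0, 1.0, -1.0]           // points; tweak to taste
        bob.duration = 0.12
        bob.repeatCount = .infinity

        layer.add(rotation, forKey: "wobble.rotation")
        layer.add(bob, forKey: "wobble.bob")
    }

    func stopWobble() {
        layer.removeAnimation(forKey: "wobble.rotation")
        layer.removeAnimation(forKey: "wobble.bob")
    }
}
```

Kicking off startWobble() from a UILongPressGestureRecognizer on the icon view gets you close to the springboard behaviour.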
I am implementing a camera application using the example that comes with the BlackBerry plug-in for Eclipse, named "CameraDemo". The problem is that when the screen loses focus, it does not display the camera view; instead it shows like this.
Has anybody faced this problem? What's the solution?
This way of taking a picture (using the Player and VideoControl.getSnapshot()) does not work well on all BB models. I'd even say it works well only on a narrow set of BB models. So if you are going to run your app on a wide range of BB models, this is not the right way to go.
Instead, use the built-in Camera app to take the picture. Here is a starting point on how to do that.
Basically, you invoke the built-in Camera app and listen for file-system changes to detect the new image's file path. Then you need to close the built-in Camera app somehow; it's possible to do that by simulating two 'Esc' button presses. A rough sketch of that flow is below.
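Something along these lines in BlackBerry Java, from memory of the RIM APIs (treat the details as assumptions to verify against your SDK version; note that EventInjector requires the input-simulation permission):

```java
import net.rim.blackberry.api.invoke.CameraArguments;
import net.rim.blackberry.api.invoke.Invoke;
import net.rim.device.api.io.file.FileSystemJournal;
import net.rim.device.api.io.file.FileSystemJournalEntry;
import net.rim.device.api.io.file.FileSystemJournalListener;
import net.rim.device.api.system.Application;
import net.rim.device.api.system.Characters;
import net.rim.device.api.system.EventInjector;

public class CameraCapture implements FileSystemJournalListener {

    private long lastUSN = FileSystemJournal.getNextUSN();

    public void takePicture() {
        // Watch the file system, then launch the built-in Camera app.
        Application.getApplication().addFileSystemJournalListener(this);
        Invoke.invokeApplication(Invoke.APP_TYPE_CAMERA, new CameraArguments());
    }

    public void fileJournalChanged() {
        long usn = FileSystemJournal.getNextUSN();
        for (long i = usn - 1; i >= lastUSN; i--) {
            FileSystemJournalEntry entry = FileSystemJournal.getEntry(i);
            if (entry != null
                    && entry.getEvent() == FileSystemJournalEntry.FILE_ADDED
                    && entry.getPath().endsWith(".jpg")) {
                lastUSN = usn;
                String imagePath = entry.getPath(); // the new photo
                closeCamera();
                // ... do something with imagePath ...
                return;
            }
        }
        lastUSN = usn;
    }

    private void closeCamera() {
        // Two 'Esc' presses: dismiss the preview, then exit the Camera app.
        for (int i = 0; i < 2; i++) {
            EventInjector.invokeEvent(new EventInjector.KeyEvent(
                    EventInjector.KeyEvent.KEY_DOWN, Characters.ESCAPE, 0));
            EventInjector.invokeEvent(new EventInjector.KeyEvent(
                    EventInjector.KeyEvent.KEY_UP, Characters.ESCAPE, 0));
        }
    }
}
```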
Yes, this sounds a bit hacky/over-complicated, but that's how BB engineers arranged it for us. :) BTW, this is actually not so bad compared with Android, where different device manufacturers violate the common rules and implement the Camera app in their own specific way, so you are not able to write the code once and cover all Androids.