I am scanning barcodes on iOS with ZBar's ZBarReaderViewController. When the camera view appears, the image is very blurry, and it always takes about two seconds to focus and become clear. Why does this happen? Is there a setting that can improve it?
You can play around with the flash/torch settings to enable better focusing; bad or low light is a big enemy of autofocus. Setting the focus mode to continuous will also help with scanning.
Also keep in mind the camera's minimum focus distance: closer than that, it won't focus at all.
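As a sketch of the continuous-focus and torch suggestions above (assuming you can get at the `AVCaptureDevice` backing the scanner; the helper name is illustrative, not a ZBar API):

```swift
import AVFoundation

// Hypothetical helper: configure a capture device for barcode scanning.
// Error handling is kept minimal for brevity.
func configureForScanning(_ device: AVCaptureDevice) {
    do {
        try device.lockForConfiguration()
        // Continuous autofocus keeps refocusing as the barcode moves.
        if device.isFocusModeSupported(.continuousAutoFocus) {
            device.focusMode = .continuousAutoFocus
        }
        // Barcodes are usually held close, so restricting autofocus
        // to the near range can speed up the initial focus hunt.
        if device.isAutoFocusRangeRestrictionSupported {
            device.autoFocusRangeRestriction = .near
        }
        // Turn on the torch in low light so autofocus has contrast to work with.
        if device.hasTorch && device.isTorchModeSupported(.on) {
            device.torchMode = .on
        }
        device.unlockForConfiguration()
    } catch {
        print("Could not lock device for configuration: \(error)")
    }
}
```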
I’ve been developing an iOS app using Swift 4, but recently (today) decided to switch to Flutter/Dart after hearing about its capabilities.
In my iOS app, I have a moving background of waves (actual waves when you think of an ocean).
The wave .png is 1606 units wide, and I animated it so that it translates from right to left by 265 units in 1 second, then repeats. This way the waves appear to move continuously, when in reality it’s only a fraction of the entire png repeating itself.
I needed this same background on every ViewController (screen) in the app. I did this by passing the last known x value through the segue right before the transition (a transition between view controllers; I believe the Flutter equivalent is a “route”) and using that value as the initial position of the waves in the next ViewController. When I swipe up or down to move between view controllers, the waves move accordingly.
For some reason the animation was a bit choppy, though I’d say 80% of the time it was perfectly smooth. I need it to be 100% before I release my app, though.
How would I go about accomplishing this type of animation in Dart?
For animations, Flare seems very promising and I’m leaning towards using it to accomplish my goal, but I’d like to hear any advice on how I should approach this.
Animation performance depends on how it's implemented, and without a minimal repro it'll be challenging to figure out the issue. One option you may consider for animating the background is the AnimatedPositioned widget, which lets you animate the position of widgets displayed on screen.
Another option is to create the animation in Adobe After Effects and render it in your app with the lottie plugin.
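A minimal sketch of the AnimatedPositioned idea, using your 1606-wide image and 265-unit period (the asset path and widget names here are placeholders, not from your project). The trick is to animate left by one period, then snap back instantly with a zero duration before sliding again:

```dart
import 'package:flutter/material.dart';

class WaveBackground extends StatefulWidget {
  const WaveBackground({super.key});
  @override
  State<WaveBackground> createState() => _WaveBackgroundState();
}

class _WaveBackgroundState extends State<WaveBackground> {
  double _left = 0; // current x offset of the wave image
  Duration _duration = const Duration(seconds: 1);

  @override
  void initState() {
    super.initState();
    // Start the first slide once the first frame has been laid out.
    WidgetsBinding.instance.addPostFrameCallback((_) => _slide());
  }

  // Slide left by one wave period (265 units) over 1 second.
  void _slide() => setState(() {
        _duration = const Duration(seconds: 1);
        _left = -265;
      });

  @override
  Widget build(BuildContext context) {
    return Stack(children: [
      AnimatedPositioned(
        left: _left,
        bottom: 0,
        duration: _duration,
        onEnd: () {
          // Snap back instantly (zero duration), then repeat, so the
          // repeating pattern inside the 1606-unit image loops seamlessly.
          setState(() {
            _duration = Duration.zero;
            _left = 0;
          });
          WidgetsBinding.instance.addPostFrameCallback((_) => _slide());
        },
        child: Image.asset('assets/waves.png', width: 1606),
      ),
    ]);
  }
}
```

To carry the wave position across routes (the equivalent of your segue trick), you could pass the current `_left` value as an argument to the next route and use it as the initial offset there.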
Currently I am trying to record the movements and rotations of my SCNNode. I am writing the movement data to a CSV and then examining it afterwards. Everything is working fine except that when you move the phone, the data changes because of the SCNNode changing in world space. To elaborate: the node isn't moving or rotating, but the movement of the phone distorts the data in a way that makes it look like the node is moving.
I have read Apple's documentation on ARSessionConfiguration.worldAlignment, and I think it could be possible to cancel out the movement of the phone using the gravity property (the default worldAlignment).
Does anyone have any advice as to how I could achieve this?
Update:
As mentioned above, my original approach to solving this was to change the ARSessionConfiguration. When you change that, the only thing that really changes is where the SCNNode starts in world space, so the change did not affect how the movement is represented.
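One thing worth checking, as a sketch rather than a definitive fix (`sceneView` and `trackedNode` are assumed names from your view controller): log the node's pose in ARKit's fixed world coordinate system rather than anything camera-relative. Pure device movement should then leave the recorded values unchanged, apart from tracking drift:

```swift
import ARKit
import SceneKit

// Sketch: record a node's pose in world space, not relative to the camera.
func logPose(of trackedNode: SCNNode, in sceneView: ARSCNView) {
    // The world transform is expressed in ARKit's fixed world coordinates,
    // so moving the phone should (ideally) not change it for a static node.
    let world = trackedNode.simdWorldTransform
    let position = SIMD3<Float>(world.columns.3.x,
                                world.columns.3.y,
                                world.columns.3.z)

    // For comparison: the same pose expressed relative to the camera DOES
    // change whenever the phone moves, which matches the symptom described.
    if let frame = sceneView.session.currentFrame {
        let cameraRelative = simd_mul(frame.camera.transform.inverse, world)
        _ = cameraRelative // write both to your CSV and compare
    }
    print("world position: \(position)")
}
```

Any residual change in the world-space values for a static node would then be tracking drift rather than phone motion.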
I'm trying to make Pepper perform a really basic sequence of movements with Choregraphe: a rotation, then one meter forward, then another rotation, and finally one meter forward.
Most of the times I run the behaviour, the sequence cannot be completed because the robot freezes. Every time I can hear the noise of the motors, but most of the time the robot won't move. Please consider that it is on a perfectly smooth surface.
Does anybody know what could be the reason for this problem? Do you have any suggestions on how to fix it?
The version of NAOqi is 2.5.5.5
The robot has a lot of safety features. If the robot can't move because of an obstacle, the Choregraphe box will report that your movement failed (grey output on the Move To box) and cancel your flow. In your program, the flow will only continue if the movement is a success.
As mcaniot says, the robot has some aggressive safety features and may stop suddenly. However, if you know what you are doing and accept the risk, you can disable the security in the web settings.
Read the specifics of the collision avoidance here:
http://doc.aldebaran.com/2-5/naoqi/motion/reflexes-external-collision.html#pepp-pepper
You can enable/disable it with this method:
http://doc.aldebaran.com/2-5/naoqi/motion/reflexes-external-collision-api.html#ALMotionProxy::setExternalCollisionProtectionEnabled__ssCR.bCR
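If you do decide to disable it (at your own risk), the call looks roughly like this with the Python qi SDK; the robot IP is a placeholder you must replace:

```python
import qi

# Connect to the robot (replace <robot-ip> with your Pepper's address).
session = qi.Session()
session.connect("tcp://<robot-ip>:9559")

motion = session.service("ALMotion")

# Disable external-collision protection for the "Move" action only.
# Per the NAOqi 2.5 docs, "All", "Arms" and "Legs" are also accepted names.
motion.setExternalCollisionProtectionEnabled("Move", False)

# ... run your rotation / moveTo sequence here ...

# Re-enable protection when you are done.
motion.setExternalCollisionProtectionEnabled("Move", True)
```

Keeping the re-enable call in a `finally` block is a good idea so the protection is restored even if your sequence raises an error.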
I have programmed an iPad application whose behaviour I would like to change when I put the iPad in a wooden frame (other materials could be added later). To simplify: the background should change whenever the iPad is inside this frame, with no tap or touch interaction, just putting the iPad inside the frame.
Of course, we could program a specific gesture on the screen, like double tapping or swiping, but that is not the solution I'm looking for.
Another thought was to detect the lack of movement for a certain amount of time, but that would not guarantee that the iPad is inside the frame.
I have thought about interacting with magnets (as smart covers do) and the sleep sensor on the right side of the iPad, but I don't know how to do it.
I cannot see any other useful sensor.
Any suggestion?
A combination of accelerometer and the camera seems like an idea worth trying out:
Scan the accelerometer data to detect a spike followed by a flat line (= putting the iPad into the frame, then resting).
After detecting the motion event, use the back camera (maybe combined with the flash) to detect a pattern image fixed inside the frame for this purpose. It might be necessary to recess the pattern in a small hole so that the camera captures at least a blurry image of it.
The second step is there to distinguish the frame from any other surface the iPad might be placed upon.
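The first step above could be sketched with Core Motion roughly like this; the thresholds are guesses that would need tuning on a real device, and the callback name is illustrative:

```swift
import CoreMotion

let motionManager = CMMotionManager()
var sawSpike = false
var quietSamples = 0

// Detect a spike (iPad being slotted into the frame) followed by
// ~1 second of stillness, then hand off to the camera check (step 2).
func startFrameDetection(onRested: @escaping () -> Void) {
    guard motionManager.isAccelerometerAvailable else { return }
    motionManager.accelerometerUpdateInterval = 1.0 / 50.0
    motionManager.startAccelerometerUpdates(to: .main) { data, _ in
        guard let a = data?.acceleration else { return }
        // Deviation of the total acceleration from 1 g (the value at rest).
        let magnitude = abs(sqrt(a.x * a.x + a.y * a.y + a.z * a.z) - 1.0)
        if magnitude > 0.4 {            // spike: sudden handling of the device
            sawSpike = true
            quietSamples = 0
        } else if sawSpike && magnitude < 0.03 {
            quietSamples += 1
            if quietSamples > 50 {      // ~1 s flat line at 50 Hz
                sawSpike = false
                onRested()              // step 2: check the camera for the pattern
            }
        }
    }
}
```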
I'm trying to build an app that has a moveable character as its main idea.
The concept:
Every time you shake your iPhone, you shake several parts of the character (arms, legs, torso, head) at its joints.
I managed to use the accelerometer to detect the shake, but I haven't found a conclusive approach for how to move the character's body parts.
Ideally, every time you shake the phone the character would move around with a gravity-like effect.
Any idea how I could achieve such an effect in Xcode?
Have you considered using a physics engine?
check out Box2D:
http://box2d.org
If you want a gravity-like effect, you could attach the limbs to a torso physics body and then apply a force to the "torso" of the character to get a ragdoll type of effect. You could base the applied force on how the device was shaken. I'm not sure if that's what you're going for with the shake effect, though.
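A minimal Box2D sketch of that ragdoll idea, with a single arm attached to the torso by a revolute joint; all sizes and the shake strength are placeholder values:

```cpp
#include <box2d/box2d.h>

int main() {
    b2World world(b2Vec2(0.0f, -10.0f)); // gravity pulls the limbs down

    // Torso: a dynamic box the limbs hang off.
    b2BodyDef torsoDef;
    torsoDef.type = b2_dynamicBody;
    torsoDef.position.Set(0.0f, 4.0f);
    b2Body* torso = world.CreateBody(&torsoDef);
    b2PolygonShape torsoBox;
    torsoBox.SetAsBox(0.5f, 1.0f);
    torso->CreateFixture(&torsoBox, 1.0f);

    // One arm, attached at the shoulder with a revolute joint so it swings.
    b2BodyDef armDef;
    armDef.type = b2_dynamicBody;
    armDef.position.Set(0.9f, 4.8f);
    b2Body* arm = world.CreateBody(&armDef);
    b2PolygonShape armBox;
    armBox.SetAsBox(0.4f, 0.1f);
    arm->CreateFixture(&armBox, 0.5f);

    b2RevoluteJointDef shoulder;
    shoulder.Initialize(torso, arm, b2Vec2(0.5f, 4.8f)); // anchor at the shoulder
    world.CreateJoint(&shoulder);

    // Map the measured shake strength (from the accelerometer) to an impulse
    // applied at the torso's center of mass.
    float shakeStrength = 3.0f; // placeholder; derive it from the shake data
    torso->ApplyLinearImpulse(b2Vec2(shakeStrength, 0.0f),
                              torso->GetWorldCenter(), true);

    // Step the simulation as you would in your game loop.
    for (int i = 0; i < 60; ++i) world.Step(1.0f / 60.0f, 8, 3);
    return 0;
}
```

The head and legs would be further bodies attached with their own joints; only the torso needs the impulse, and the joints propagate the motion for the ragdoll effect.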