I am making a text-recognition demo with the camera using the react-native-camera library, but the camera is not opening.
I have done all of these steps:
npm install react-native-camera --save
react-native link react-native-camera
Go to node_modules ➜ react-native-camera and add RNCamera.xcodeproj.
Expand the RNCamera.xcodeproj ➜ Products folder.
In Xcode, in the project navigator, select your project. Add libRNCamera.a to your project's Build Phases ➜ Link Binary With Libraries.
Click RNCamera.xcodeproj in the project navigator and go to the Build Settings tab. Make sure 'All' is toggled on (instead of 'Basic').
In the Search Paths section, look for Header Search Paths and make sure it contains both $(SRCROOT)/../../react-native/React and $(SRCROOT)/../../../React - mark both as recursive.
import { RNCamera } from 'react-native-camera';

camerascan() {
  console.log("camscan=====")
  return (
    <RNCamera
      ref={ref => {
        this.camera = ref;
      }}
      defaultTouchToFocus
      mirrorImage={false}
      captureAudio={false}
      style={{
        flex: 1,
        justifyContent: 'space-between',
        alignItems: 'center',
        height: Dimensions.get('window').height,
        width: Dimensions.get('window').width,
      }}
      permissionDialogTitle={'Permission to use camera'}
      permissionDialogMessage={'We need your permission to use your camera phone'}
    >
      <View
        style={{
          height: 56,
          backgroundColor: 'transparent',
          alignSelf: 'flex-end',
        }}
      >
        <TouchableOpacity onPress={this.takePicture.bind(this)}>
          <Text style={styles.capture}> [CAPTURE CARD]</Text>
        </TouchableOpacity>
      </View>
    </RNCamera>
  );
}

takePicture = async function() {
  console.log("takePicture=====")
  if (this.camera) {
    // const options = { quality: 0.5, base64: true };
    // const data = await this.camera.takePictureAsync(options)
    const data = await this.camera.takePictureAsync();
    console.warn('takePicture ', data);
    // this.detectText(data.base64)
  }
};
There is no error, but the camera is not opening.
I have already granted the runtime camera permission:
import Permissions from 'react-native-permissions'

componentDidMount() {
  this.determinePermission();
}

determinePermission() {
  Permissions.request('camera', { type: 'always' }).then(response => {
    this.setState({ locationPermission: response })
  })
}
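While debugging, a small variation of the request above can help confirm that the permission is actually granted (a sketch assuming react-native-permissions 1.x, where the response is a string such as 'authorized' or 'denied'; the logging line is my addition, not part of the original code):

determinePermission() {
  Permissions.request('camera').then(response => {
    // Expect 'authorized' here; anything else means the camera permission was not granted.
    console.log('camera permission response:', response)
    this.setState({ locationPermission: response })
  })
}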
I have installed react-native-gesture-handler with yarn, then cd'd into ios and ran pod install. I then re-ran react-native run-ios, and still nothing happens when I swipe. I'm getting zero errors; it just doesn't swipe at all. Am I doing something wrong? I have tried to remedy this situation every way I can think of, and it just doesn't seem to swipe no matter what.
My code is as follows:
import React, {useState} from 'react';
import {
  Platform,
  View,
  Text,
  StyleSheet,
  Image,
  TouchableOpacity,
  flatList,
} from 'react-native';
import Swipeable from 'react-native-gesture-handler/Swipeable';

const styles = StyleSheet.create({
  container: {
    padding: 20,
    flexDirection: 'row',
    backgroundColor: '#fff',
    justifyContent: 'space-between',
    alignItems: 'center',
  },
  text: {
    fontSize: 18,
    color: '#69696969',
  },
  icon: {
    height: 30,
    tintColor: '#69696969',
    ...Platform.select({
      ios: {
        tintColor: 'blue',
      },
      android: {
        tintColor: 'red',
      },
    }),
  },
  separator: {
    flex: 1,
    height: 1,
    backgroundColor: 'rgba(0, 0, 0, 0.2)',
  },
});

export const Separator = () => <View style={styles.separator} />;

const LeftAction = () => {
  <View>
    <Text>test</Text>
  </View>;
};

const ListItem = ({name, onFavoritePress}) => {
  const [isFavorite, setIsFavorite] = useState(false);
  let starIcon;
  if (isFavorite) {
    starIcon = Platform.select({
      ios: require('../assets/icons/ios-star.png'),
      android: require('../assets/icons/md-star.png'),
    });
  } else {
    starIcon = Platform.select({
      ios: require('../assets/icons/ios-star-outline.png'),
      android: require('../assets/icons/md-star-outline.png'),
    });
  }
  return (
    <Swipeable renderLeftActions={LeftAction}>
      <View style={styles.container}>
        <Text style={styles.text}>{name}</Text>
        {onFavoritePress && (
          <TouchableOpacity
            onPress={() => setIsFavorite((prevIsFavorite) => !prevIsFavorite)}>
            <Image style={styles.icon} resizeMode="contain" source={starIcon} />
          </TouchableOpacity>
        )}
      </View>
    </Swipeable>
  );
};

export default ListItem;
You need to link RNGH to RN:
run react-native link react-native-gesture-handler
If this doesn't work, you may be missing the CocoaPods dependencies. To install them:
run cd <your-ios-code-directory> && pod install
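Depending on the react-native-gesture-handler version, its installation guide also asks you to import the package at the very top of your entry file, before anything else. A minimal sketch, assuming the entry point is index.js and the root component lives in ./App (both are assumptions about your project layout):

// index.js
import 'react-native-gesture-handler'; // the RNGH docs want this as the very first import
import { AppRegistry } from 'react-native';
import App from './App';
import { name as appName } from './app.json';

AppRegistry.registerComponent(appName, () => App);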
I am building my first React Native app and implementing tabs using react-native-tab-view. I am stuck with the error:
"TypeError: undefined is not an object (evaluating '_this.props.navigationState.routes.length')"
import * as React from 'react';
import {
  Platform, StyleSheet, Text, View, Dimensions, StatusBar, FlatList, ImageBackground, TextInput
} from 'react-native';
import { TabView, SceneMap } from 'react-native-tab-view';

const FirstRoute = () => (
  <View style={[styles.scene, { backgroundColor: '#ff4081' }]} />
);

const SecondRoute = () => (
  <View style={[styles.scene, { backgroundColor: '#673ab7' }]} />
);

export default class App extends React.Component {
  state = {
    index: 0,
    routes: [
      { key: 'first', title: 'First' },
      { key: 'second', title: 'Second' },
    ],
  };

  render() {
    return (
      <View style={styles.container}>
        <TabView
          navigationState={this.state}
          renderScene={SceneMap({
            first: FirstRoute,
            second: SecondRoute,
          })}
          onIndexChange={index => this.setState({ index })}
          initialLayout={{ width: Dimensions.get('window').width }}
          style={styles.container}
        />
      </View>
    );
  }
}

const styles = StyleSheet.create({
  container: {
    flex: 1,
    justifyContent: 'center',
    alignItems: 'center',
    backgroundColor: '#F5FCFF',
    marginTop: StatusBar.currentHeight
  },
  scene: {
    flex: 1,
  },
});
So I copied/pasted and ran your code (with a different background color) as an Expo Snack here: https://snack.expo.io/B1-xKYu2N and it's working.
If this is your first project, the most likely issue is missing or incorrectly installed packages. Double-check your package.json for something like "react-native-tab-view": "^2.0.1". If it's there (or after you add it), try running rm -rf ./node_modules && npm install in a terminal from inside the project directory to delete the packages and re-install them. Wish I could be of more help!
I get this error on Windows 10, using the Expo simulator with an iPhone 6 Plus, when I try to run the Camera. I tried to follow the steps in this tutorial:
1 - npm install react-native-camera --save
2 - In Xcode, in the project navigator, right click Libraries ➜ Add Files to [your project's name]
3 - Go to node_modules ➜ react-native-camera and add RNCamera.xcodeproj
4 - In Xcode, in the project navigator, select your project. Add libRNCamera.a to your project's Build Phases ➜ Link Binary With Libraries
5 - Click RNCamera.xcodeproj in the project navigator and go to the Build Settings tab. Make sure 'All' is toggled on (instead of 'Basic'). In the Search Paths section, look for Header Search Paths and make sure it contains both $(SRCROOT)/../../react-native/React and $(SRCROOT)/../../../React - mark both as recursive.
The problem is that I am on Windows, and Xcode does not exist on Windows. How do I fix this on Windows?
Please give me some help, thank you.
My code is:
import React, { Component } from 'react';
import {Text, View, TouchableOpacity, TouchableHighlight} from 'react-native';
import Camera from 'react-native-camera';

const myStyle = {
  container: {
    flex: 1,
    flexDirection: 'row',
  },
  preview: {
    flex: 1,
    justifyContent: 'flex-end',
    alignItems: 'center'
  },
  capture: {
    flex: 0,
    backgroundColor: '#fff',
    borderRadius: 5,
    color: 'red',
    padding: 10,
    margin: 40
  }
};

export default class CameraAcess extends Component {
  constructor(props) {
    super(props);
    this.state = {hasCameraPermission: null, type: Camera.Constants.Type.back};
  }

  async componentWillMount() {
    const { status } = await Permissions.askAsync(Permissions.CAMERA);
    this.setState({ hasCameraPermission: status === 'granted' });
  }

  render() {
    const {container, capture, preview} = myStyle;
    const { hasCameraPermission } = this.state;
    if (hasCameraPermission === null) {
      return <View/>;
    } else if (hasCameraPermission === false) {
      return <Text>No access to camera</Text>;
    } else {
      return (
        <View style={{ flex: 1 }}>
          <Camera style={{ flex: 1 }} type={this.state.type}>
            <View style={{flex: 1, backgroundColor: 'transparent', flexDirection: 'row', justifyContent: 'space-between'}}>
              <TouchableOpacity
                style={{flex: 0.1, alignSelf: 'flex-end', alignItems: 'center'}}
                onPress={() => {
                  this.setState({
                    type: this.state.type === Camera.Constants.Type.back
                      ? Camera.Constants.Type.front
                      : Camera.Constants.Type.back,
                  });
                }}>
                <Text style={{ fontSize: 18, marginBottom: 10, color: 'white' }}>{' '}Flip{' '}</Text>
              </TouchableOpacity>
              <TouchableOpacity
                onPress={() => {this.props.navigator.push({id: 'MenuPrincipal'});}}
                style={{alignSelf: 'flex-end', alignItems: 'center', backgroundColor: 'transparent'}}>
                <Text style={{ fontSize: 18, marginBottom: 10, color: 'white' }}>[Back]</Text>
              </TouchableOpacity>
            </View>
          </Camera>
        </View>
      );
    }
  }
}
I am trying to access the camera but I get this error. I tried all the tutorials that people have posted on other pages to make it work, but I didn't have the same luck.
I went into this file: \myproject\node_modules\lottie-ios\Example-Swift\Pods\Target Support Files\lottie-ios\Info.plist and added this:
<key>NSCameraUsageDescription</key>
<string>Feetit Would you like to access Camera</string>
<key>NSPhotoLibraryUsageDescription</key>
<string>Feetit Would you like to access Photos</string>
Before that I installed this:
npm install react-native-camera --save
npm install react-native link react-native-camera
After that I tried this code to make the Camera work:
import React, {Component} from 'react';
import Camera from 'react-native-camera';
import {View, TouchableHighlight, Text, StyleSheet} from 'react-native';

export default class CameraAcess extends Component {
  takePicture() {
    const options = {};
    //options.location = ...
    this.camera.capture({metadata: options})
      .then((data) => console.log(data))
      .catch(err => console.error(err));
  }

  onBarCodeRead(e) {
    console.log(
      "Barcode Found!", "Type: " + e.type + "\nData: " + e.data
    );
  }

  render() {
    return (
      <View style={styles.container}>
        <Camera
          ref={(cam) => {
            this.camera = cam;
          }}
          onBarCodeRead={this.onBarCodeRead.bind(this)}
          style={styles.preview}>
          <TouchableHighlight style={styles.capture} onPress={this.takePicture.bind(this)}>[CAPTURE]
            <Text>Click me</Text>
          </TouchableHighlight>
        </Camera>
      </View>
    );
  }
}

const styles = StyleSheet.create({
  container: {
    flex: 1,
    flexDirection: 'row',
  },
  preview: {
    flex: 1,
    justifyContent: 'flex-end',
    alignItems: 'center'
  },
  capture: {
    flex: 0,
    backgroundColor: '#fff',
    borderRadius: 5,
    color: '#000',
    padding: 10,
    margin: 40
  }
});
I get this error: 'Cannot read property 'Aspect' of undefined'. The file the error points to is a different one, Camera.js. Could someone please tell me what I have to do to make it work?
I looked over the docs and did not see this prop listed as required on the Camera; however, in an example I built I did set it on the Camera. See if this helps, as it may not start with a default value:
aspect={Camera.constants.Aspect.fill}
Change it to your preference.
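For context, a minimal sketch of where that prop would sit in the render() from the question (everything except the aspect line is copied from the code above; the legacy react-native-camera constants also include Aspect.fit and Aspect.stretch):

<Camera
  ref={(cam) => {
    this.camera = cam;
  }}
  aspect={Camera.constants.Aspect.fill} // the line this answer adds; fit and stretch are alternatives
  onBarCodeRead={this.onBarCodeRead.bind(this)}
  style={styles.preview}>
  {/* capture button as in the question */}
</Camera>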
If you are facing this problem on iOS, then remove the previously installed react-native-camera pod:
$ cd ios
$ rm -rf Pods/
$ pod install
And then link it manually,
as mentioned in the library readme.
Please also check the permissions that are required for the camera.
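On the permissions point, here is a minimal sketch (my own, not from the library readme) of checking the Android runtime camera permission with React Native's built-in PermissionsAndroid before showing the camera; on iOS this is covered by the Info.plist keys shown earlier in the question:

import { PermissionsAndroid, Platform } from 'react-native';

// Resolves to true when the camera permission is granted (or not needed, as on iOS).
async function ensureCameraPermission() {
  if (Platform.OS !== 'android') {
    return true;
  }
  const result = await PermissionsAndroid.request(
    PermissionsAndroid.PERMISSIONS.CAMERA,
    {
      title: 'Camera permission',
      message: 'This app needs access to your camera.',
      buttonPositive: 'OK',
    }
  );
  return result === PermissionsAndroid.RESULTS.GRANTED;
}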
Description
I need to write an e2e test that at some point has to select an image in UIImagePickerController. I tried to use element(by.type('UIImagePickerController')).tapAtPoint() with no luck. I need a way to select an image. I have found a way to do it with native tests.
Also, mocking isn't an option for me, since I use a higher version than the one react-native-repackager needs.
Steps to Reproduce
Use with any application that uses image picker
Try to use element(by.type('UIImagePickerController')).tapAtPoint({ x: 50, y: 200 })
Detox, Node, Device, Xcode and macOS Versions
Detox: 6.0.2
Node: 8.9.0
Device: iOS Simulator 6s
Xcode: 9.2
macOS: 10.13.1
React-Native: 0.46.4
Device and verbose Detox logs
There are no logs; the device taps at the right location, but the tap doesn't have any effect.
I noticed the original question stated that mocks were not an option in the case presented, but I came across this Stack Overflow question a few times in my searches for a solution and thought I would share what I ultimately came up with for my situation.
I was able to get around the limitations for the e2e test by wrapping react-native-image-picker in my own export:
ImagePicker.js
import ImagePicker from 'react-native-image-picker';
export default ImagePicker;
And then creating a mock with a custom extension (e.g. e2e.js):
ImagePicker.e2e.js
const mockImageData = '/9j/4AAQSkZ...MORE BASE64 DATA OF CUTE KITTENS HERE.../9k=';

export default {
  showImagePicker: function showImagePicker(options, callback) {
    if (typeof options === 'function') {
      callback = options;
    }
    callback({
      data: mockImageData,
    });
  },
};
Finally, configure the metro bundler to prioritize your custom extension:
[project root]/rn-cli.config.js
const defaultSourceExts = require('metro-config/src/defaults/defaults')
  .sourceExts;

module.exports = {
  resolver: {
    sourceExts: process.env.RN_SRC_EXT
      ? process.env.RN_SRC_EXT.split(',').concat(defaultSourceExts)
      : defaultSourceExts,
  },
};
Then run with the RN_SRC_EXT environment variable set to the custom extension:
RN_SRC_EXT=e2e.js react-native start
See the Detox Mocking Guide for more information.
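With that mock in place the native picker never opens in the e2e build, so the Detox test itself stays simple. A hypothetical sketch (the 'select_photo' testID matches the example further down; 'avatar_image' is an assumed testID you would add to the rendered <Image>):

it('uses the mocked image picker response', async () => {
  // Tapping the button calls showImagePicker, which the e2e mock answers immediately.
  await element(by.id('select_photo')).tap();
  // 'avatar_image' is an assumed testID on the <Image> that shows the picked photo.
  await expect(element(by.id('avatar_image'))).toBeVisible();
});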
Not sure if this is related, but for iOS 11 I can't even see those native view types in the Debug View Hierarchy.
For iOS 9 and 10, however, I would solve the problem like this:
it('select first image from camera roll', async () => {
  // select a photo
  await element(by.id('select_photo')).tap();
  // Choose from Library...
  await element(by.traits(['button']).and(by.type('_UIAlertControllerActionView'))).atIndex(1).tap();
  // select Camera Roll, use index 0 for Moments
  await element(by.type('UITableViewCellContentView')).atIndex(1).tap();
  // select first image
  await element(by.type('PUPhotoView')).atIndex(0).tap();
});
There are probably many other possibilities to solve this problem with different native view types and accessibility traits.
I just used the example provided by react-native-image-picker to test with the above code:
import React from 'react';
import {
  AppRegistry,
  StyleSheet,
  Text,
  View,
  PixelRatio,
  TouchableOpacity,
  Image,
} from 'react-native';
import ImagePicker from 'react-native-image-picker';

export default class App extends React.Component {
  state = {
    avatarSource: null,
    videoSource: null
  };

  selectPhotoTapped() {
    const options = {
      quality: 1.0,
      maxWidth: 500,
      maxHeight: 500,
      storageOptions: {
        skipBackup: true
      }
    };

    ImagePicker.showImagePicker(options, (response) => {
      console.log('Response = ', response);

      if (response.didCancel) {
        console.log('User cancelled photo picker');
      }
      else if (response.error) {
        console.log('ImagePicker Error: ', response.error);
      }
      else if (response.customButton) {
        console.log('User tapped custom button: ', response.customButton);
      }
      else {
        let source = { uri: response.uri };
        // You can also display the image using data:
        // let source = { uri: 'data:image/jpeg;base64,' + response.data };
        this.setState({
          avatarSource: source
        });
      }
    });
  }

  selectVideoTapped() {
    const options = {
      title: 'Video Picker',
      takePhotoButtonTitle: 'Take Video...',
      mediaType: 'video',
      videoQuality: 'medium'
    };

    ImagePicker.showImagePicker(options, (response) => {
      console.log('Response = ', response);

      if (response.didCancel) {
        console.log('User cancelled video picker');
      }
      else if (response.error) {
        console.log('ImagePicker Error: ', response.error);
      }
      else if (response.customButton) {
        console.log('User tapped custom button: ', response.customButton);
      }
      else {
        this.setState({
          videoSource: response.uri
        });
      }
    });
  }

  render() {
    return (
      <View style={styles.container}>
        <TouchableOpacity testID="select_photo" onPress={this.selectPhotoTapped.bind(this)}>
          <View style={[styles.avatar, styles.avatarContainer, {marginBottom: 20}]}>
            { this.state.avatarSource === null ? <Text>Select a Photo</Text> :
              <Image style={styles.avatar} source={this.state.avatarSource} />
            }
          </View>
        </TouchableOpacity>

        <TouchableOpacity onPress={this.selectVideoTapped.bind(this)}>
          <View style={[styles.avatar, styles.avatarContainer]}>
            <Text>Select a Video</Text>
          </View>
        </TouchableOpacity>

        { this.state.videoSource &&
          <Text style={{margin: 8, textAlign: 'center'}}>{this.state.videoSource}</Text>
        }
      </View>
    );
  }
}

const styles = StyleSheet.create({
  container: {
    flex: 1,
    justifyContent: 'center',
    alignItems: 'center',
    backgroundColor: '#F5FCFF'
  },
  avatarContainer: {
    borderColor: '#9B9B9B',
    borderWidth: 1 / PixelRatio.get(),
    justifyContent: 'center',
    alignItems: 'center'
  },
  avatar: {
    borderRadius: 75,
    width: 150,
    height: 150
  }
});

AppRegistry.registerComponent('example', () => App);