E2E: Select an image from a UIImagePickerController with Wix Detox - ios

Description
I need to write an e2e test that at some point has to select an image from a UIImagePickerController. I tried to use element(by.type('UIImagePickerController')).tapAtPoint() with no luck. I need a way to select an image; I have found a way to do it with native tests.
Also, mocking isn't an option for me, since I use a higher version than the one react-native-repackager needs.
Steps to Reproduce
Use any application that uses an image picker.
Try to use element(by.type('UIImagePickerController')).tapAtPoint({ x: 50, y: 200 })
Detox, Node, Device, Xcode and macOS Versions
Detox: 6.0.2
Node: 8.9.0
Device: iPhone 6s Simulator
Xcode: 9.2
macOS: 10.13.1
React-Native: 0.46.4
Device and verbose Detox logs
There are no logs; the device taps at the right location, but the tap has no effect.

I noticed the original question stated that mocks were not an option in the case presented, but I came across this Stack Overflow question a few times while searching for a solution and thought I would share what I ultimately came up with for my situation.
I was able to get around the limitations for the e2e test by wrapping react-native-image-picker in my own export:
ImagePicker.js
import ImagePicker from 'react-native-image-picker';
export default ImagePicker;
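Application code then imports the wrapper instead of the library directly, so Metro can swap in the mock at bundle time. A minimal, hypothetical consumer sketch (the component file name is an assumption):
// Avatar.js (hypothetical consumer of the wrapper)
import ImagePicker from './ImagePicker';
// under an e2e bundle this resolves to ImagePicker.e2e.js;
// in a normal bundle it resolves to the real library wrapper
ImagePicker.showImagePicker({ quality: 1.0 }, (response) => {
  console.log(response.data);
});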
And then creating a mock with a custom extension (e.g., .e2e.js):
ImagePicker.e2e.js
const mockImageData = '/9j/4AAQSkZ...MORE BASE64 DATA OF CUTE KITTENS HERE.../9k=';
export default {
  showImagePicker: function showImagePicker(options, callback) {
    if (typeof options === 'function') {
      callback = options;
    }
    callback({
      data: mockImageData,
    });
  },
};
Finally, configure the Metro bundler to prioritize your custom extension:
[project root]/rn-cli.config.js
const defaultSourceExts = require('metro-config/src/defaults/defaults')
  .sourceExts;

module.exports = {
  resolver: {
    sourceExts: process.env.RN_SRC_EXT
      ? process.env.RN_SRC_EXT.split(',').concat(defaultSourceExts)
      : defaultSourceExts,
  },
};
Then run with the RN_SRC_EXT environment variable set to the custom extension:
RN_SRC_EXT=e2e.js react-native start
See the Detox Mocking Guide for more information.
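With the mock bundled in, the native picker never opens, so the test can tap the trigger and assert on the rendered result. A rough sketch, assuming the select_photo testID from the example further below and a hypothetical avatar_image testID on the resulting image:
it('selects a mocked image', async () => {
  await element(by.id('select_photo')).tap();
  // the mock calls the callback synchronously with the base64 data,
  // so the image should render without any native picker interaction
  await expect(element(by.id('avatar_image'))).toBeVisible();
});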

Not sure if this is related, but on iOS 11 I can't even see those native view types in the Debug View Hierarchy.
For iOS 9 and 10, however, I would solve the problem like this:
it('select first image from camera roll', async () => {
  // select a photo
  await element(by.id('select_photo')).tap();
  // Choose from Library...
  await element(by.traits(['button']).and(by.type('_UIAlertControllerActionView'))).atIndex(1).tap();
  // select Camera Roll; use index 0 for Moments
  await element(by.type('UITableViewCellContentView')).atIndex(1).tap();
  // select first image
  await element(by.type('PUPhotoView')).atIndex(0).tap();
});
There are probably many other possibilities to solve this problem with different native view types and accessibility traits.
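For instance, matching on accessibility labels rather than private class names might survive OS updates better. A hypothetical variant (the label strings are assumptions and would need to be verified on the target iOS version):
// tap the alert action by its visible title instead of _UIAlertControllerActionView
await element(by.label('Choose from Library...')).atIndex(0).tap();
// tap the album cell by its title instead of UITableViewCellContentView
await element(by.label('Camera Roll')).atIndex(0).tap();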
I just used the example provided by react-native-image-picker to test the above code:
import React from 'react';
import {
  AppRegistry,
  StyleSheet,
  Text,
  View,
  PixelRatio,
  TouchableOpacity,
  Image,
} from 'react-native';
import ImagePicker from 'react-native-image-picker';

export default class App extends React.Component {
  state = {
    avatarSource: null,
    videoSource: null
  };

  selectPhotoTapped() {
    const options = {
      quality: 1.0,
      maxWidth: 500,
      maxHeight: 500,
      storageOptions: {
        skipBackup: true
      }
    };
    ImagePicker.showImagePicker(options, (response) => {
      console.log('Response = ', response);
      if (response.didCancel) {
        console.log('User cancelled photo picker');
      }
      else if (response.error) {
        console.log('ImagePicker Error: ', response.error);
      }
      else if (response.customButton) {
        console.log('User tapped custom button: ', response.customButton);
      }
      else {
        let source = { uri: response.uri };
        // You can also display the image using data:
        // let source = { uri: 'data:image/jpeg;base64,' + response.data };
        this.setState({
          avatarSource: source
        });
      }
    });
  }

  selectVideoTapped() {
    const options = {
      title: 'Video Picker',
      takePhotoButtonTitle: 'Take Video...',
      mediaType: 'video',
      videoQuality: 'medium'
    };
    ImagePicker.showImagePicker(options, (response) => {
      console.log('Response = ', response);
      if (response.didCancel) {
        console.log('User cancelled video picker');
      }
      else if (response.error) {
        console.log('ImagePicker Error: ', response.error);
      }
      else if (response.customButton) {
        console.log('User tapped custom button: ', response.customButton);
      }
      else {
        this.setState({
          videoSource: response.uri
        });
      }
    });
  }

  render() {
    return (
      <View style={styles.container}>
        <TouchableOpacity testID="select_photo" onPress={this.selectPhotoTapped.bind(this)}>
          <View style={[styles.avatar, styles.avatarContainer, {marginBottom: 20}]}>
            { this.state.avatarSource === null ? <Text>Select a Photo</Text> :
              <Image style={styles.avatar} source={this.state.avatarSource} />
            }
          </View>
        </TouchableOpacity>
        <TouchableOpacity onPress={this.selectVideoTapped.bind(this)}>
          <View style={[styles.avatar, styles.avatarContainer]}>
            <Text>Select a Video</Text>
          </View>
        </TouchableOpacity>
        { this.state.videoSource &&
          <Text style={{margin: 8, textAlign: 'center'}}>{this.state.videoSource}</Text>
        }
      </View>
    );
  }
}

const styles = StyleSheet.create({
  container: {
    flex: 1,
    justifyContent: 'center',
    alignItems: 'center',
    backgroundColor: '#F5FCFF'
  },
  avatarContainer: {
    borderColor: '#9B9B9B',
    borderWidth: 1 / PixelRatio.get(),
    justifyContent: 'center',
    alignItems: 'center'
  },
  avatar: {
    borderRadius: 75,
    width: 150,
    height: 150
  }
});

AppRegistry.registerComponent('example', () => App);

Related

React Native: iOS app stuck at splash screen

My iOS app is stuck at the splash screen. I can't debug it because there are no logs when the app gets stuck, so it looks like the app can't load, or it's something else I don't know. This is my App.js:
import React, { Component } from 'react';
import AppLoading from 'expo-app-loading';
import { Asset } from 'expo-asset';
import { createDrawerNavigator } from '@react-navigation/drawer';

const MyDrawer = createDrawerNavigator();

export default class App extends Component<Props> {
  state = {
    isReady: false,
  };

  async _cacheResourcesAsync() {
    const images = [require('./src/img/logo_0.png')];
    const cacheImages = images.map(image => {
      return Asset.fromModule(image).downloadAsync();
    });
    return Promise.all(cacheImages);
  }

  render() {
    console.log(this.state.isReady);
    if (!this.state.isReady) {
      return (
        <AppLoading
          startAsync={this._cacheResourcesAsync}
          onFinish={() => this.setState({ isReady: true })}
          onError={console.warn}
          autoHideSplash={true}
        />
      );
    }
    return (
      <MyDrawer.Navigator
        initialRouteName="Main"
        drawerContentOptions={{
          activeTintColor: '#FFF',
          itemStyle: { marginVertical: 5 },
        }}
        drawerStyle={{
          backgroundColor: '#ea2d49',
          width: 240,
          marginTop: '23%',
          color: '#fff'
        }}
      >
        <MyDrawer.Screen name="Home" component={StackScreen}/>
        <MyDrawer.Screen name="Contact US" component={ContactStackScreen}/>
        <MyDrawer.Screen name="Privacy Policy" component={PrivacyStackScreen}/>
      </MyDrawer.Navigator>
    );
  }
}
I built this with Expo.
Thank you, everyone!

expo-image-picker gives 'Network request failed' when I upload an image on iOS, but it works fine on Android

This is the code that I used to upload images on iOS:
let result = await ImagePicker.launchImageLibraryAsync({
  mediaTypes: ImagePicker.MediaTypeOptions.Images,
  allowsEditing: true,
  quality: 1,
});
And in the output I got:
Object {
  "cancelled": false,
  "height": 2001,
  "type": "image",
  "uri": "file:///Users/sajid/Library/Developer/CoreSimulator/Devices/A48F2141-D9AF-457A-9D14-D2F2D4B6336B/data/Containers/Data/Application/803F58D8-0CF9-43CA-8972-599C460687B1/Library/Caches/ExponentExperienceData/%2540sajid_542%252FGoTrillo/ImagePicker/19952322-2D5E-495B-A4AF-427EA23663A6.jpg",
  "width": 3000,
}
Up to this point it works fine; later, when I call fetch on result.uri, it gives me the following error on iOS:
const response = await fetch(result.uri);
Output Error: TypeError: Network request failed
This error is not raised on Android, whereas on iOS it occurs when I use the above functions a second time. Can someone please suggest a fix for this issue?
I found a solution after a lot of searching on Google.
We need to replace file:// with /private as a fix on iOS physical devices.
You can refer to this link for more information.
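As a minimal sketch of that workaround (the helper name is hypothetical, and the replacement is applied only on iOS):
import { Platform } from 'react-native';
// rewrite the picker URI before handing it to fetch on an iOS device
const normalizeUri = (uri) =>
  Platform.OS === 'ios' ? uri.replace('file://', '/private') : uri;
const response = await fetch(normalizeUri(result.uri));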
There is another workaround for this problem while the Expo team works on patching the React Native version. For iOS you can use this implementation, and for Android a regular XHR would be sufficient; the issue is detailed here.
I had the same issue for a long time. This is what I did.
Make sure you follow the Amplify guide for setting up an app (amplify init, amplify add auth, amplify push, and then amplify add storage), and then do this:
import Amplify, { Storage } from 'aws-amplify'
import { withAuthenticator } from 'aws-amplify-react-native' // needed for the export at the bottom
import config from './src/aws-exports'
// import awsconfig from './aws-exports';
// Might need to switch line 7 to awsconfig
Amplify.configure(config)

import { StatusBar } from 'expo-status-bar';
import React, { useState, useEffect } from 'react';
import { Button, Image, View, Platform, StyleSheet, Text, TextInput } from 'react-native';
import * as ImagePicker from 'expo-image-picker';

function App() {
  const [image, setImage] = useState(null)
  const [name, setName] = useState('Evan Erickson')

  useEffect(() => {
    (async () => {
      if (Platform.OS !== 'web') {
        const { status } = await ImagePicker.requestMediaLibraryPermissionsAsync();
        if (status !== 'granted') {
          alert('Sorry, we need camera roll permissions to make this work!');
        }
      }
    })();
  }, []);

  const pickImage = async () => {
    let result = await ImagePicker.launchImageLibraryAsync({
      mediaTypes: ImagePicker.MediaTypeOptions.All,
      allowsEditing: true,
      aspect: [4, 3],
      quality: 1,
    });
    console.log(result)

    async function pathToImageFile(data) {
      try {
        const response = await fetch(data);
        const blob = await response.blob();
        await Storage.put(`customers/${name}`, blob, {
          contentType: 'image/jpeg', // contentType is optional
        });
      } catch (err) {
        console.log('Error uploading file:', err);
      }
    }
    // later
    if (!result.cancelled) {
      setImage(result.uri); // show a preview
      pathToImageFile(result.uri);
    }
  }

  return (
    <View style={{ flex: 1, alignItems: 'center', justifyContent: 'center' }}>
      <Button title="Pick an image from camera roll" onPress={pickImage} />
      {image && <Image source={{ uri: image }} style={{ width: 200, height: 200 }} />}
      <Button title="Upload image" onPress={() => {alert(image)}} />
    </View>
  );
}

const styles = StyleSheet.create({
  container: {
    flex: 1,
    backgroundColor: '#fff',
    alignItems: 'center',
    justifyContent: 'center',
  },
});

export default withAuthenticator(App)
Here is another way you can solve this problem; it worked for me as well:
https://github.com/expo/expo/issues/10394#issuecomment-700509863
import * as FileSystem from 'expo-file-system';
import * as firebase from 'firebase/app';
import * as ImagePicker from 'expo-image-picker';

// Pick the photo
const pickerResult = await ImagePicker.launchImageLibraryAsync({
  allowsEditing: true,
  aspect: [1, 1]
});

// Fetch the photo with its local URI
const file = await FileSystem.readAsStringAsync(pickerResult.uri, {
  encoding: FileSystem.EncodingType.Base64,
});

// Create a ref in Firebase (I'm using my user's ID)
const ref = firebase.storage().ref().child(`avatars/${user.uid}`);

// Upload Base64 image to Firebase
const snapshot = await ref.putString(file, 'base64');

// Create a download URL
const remoteURL = await snapshot.ref.getDownloadURL();

// Return the URL
return remoteURL;
I had the same error. I solved it with the following:
const localUri = result.uri;
const filename = localUri.split('/').pop();
const match = /\.(\w+)$/.exec(filename);
const type = match ? `image/${match[1]}` : `image`;
const obj = { uri: localUri, name: filename, type };
And then you can use this obj to upload to the server using FormData.
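For example, a hypothetical upload with FormData (the endpoint URL and field name are assumptions):
const formData = new FormData();
// React Native accepts a { uri, name, type } object as a file part
formData.append('photo', obj);
await fetch('https://example.com/upload', {
  method: 'POST',
  body: formData,
});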

On Application Resume From Background To Foreground, App Restarts From First Navigation Screen

I have the following navigation stack:
const AppNavigator = createStackNavigator({
  AppSplashScreen: AppSplashScreen,
  LanguageScreen: LanguageScreen,
  WalkthroughScreen: WalkthroughScreen,
  LoginScreen: LoginScreen,
  ForgotPasswordScreen: ForgotPasswordScreen,
  ResetPasswordScreen: ResetPasswordScreen,
  RegistrationTypeScreen: RegistrationTypeScreen,
  RegistrationFormScreen: RegistrationFormScreen,
  OTPConfirmationScreen: OTPConfirmationScreen,
  BottomTabNavigator: BottomTabNavigator
}, {
  headerMode: 'none',
  cardStyle: { backgroundColor: '#000000' },
});

const AppContainer = createAppContainer(AppNavigator);
export default App;
I am displaying a splash screen video when the app first opens.
Here is what my AppSplashScreen looks like:
import React, { Component } from 'react';
import { View } from 'react-native';
import SplashScreen from 'react-native-splash-screen';
import Video from 'react-native-video';
import { VIDEO_SPLASH_2 } from '../assets/videos/index';

export default class AppSplashScreen extends Component {
  state = {
    displayVideoPlayer: true,
    firstLaunch: false
  }

  componentDidMount() {
    SplashScreen.hide();
  }

  componentWillUnmount() {
    this.setState({
      displayVideoPlayer: false
    });
  }

  isFirstLaunch() {
    let firstLaunch = true;
    if (true === storage.get('APP_ALREADY_LAUNCHED')) {
      firstLaunch = false;
    } else {
      storage.set('APP_ALREADY_LAUNCHED', true);
      firstLaunch = true;
    }
    return firstLaunch;
  }

  didCompleteVideoPlayback() {
    if (true === this.state.displayVideoPlayer) {
      this.setState({
        displayVideoPlayer: false
      });
    }
    const currentRouteName = this.props.navigation.state.routeName;
    if ('AppSplashScreen' !== currentRouteName) {
      return false;
    }
    if (true === global.SKIP_SPLASH_SCREEN_REDIRECT) {
      return false;
    }
    if (this.isFirstLaunch()) {
      this.props.navigation.navigate('LanguageScreen');
      return false;
    }
    this.props.navigation.navigate('HomeScreen');
  }

  render() {
    return (
      <View style={{flex: 1, backgroundColor: '#000000', alignItems: 'center', justifyContent: 'center'}}>
        {true === this.state.displayVideoPlayer &&
          <Video
            source={VIDEO_SPLASH_2}
            muted={true}
            repeat={false}
            playInBackground={false}
            resizeMode="contain"
            onEnd={() => this.didCompleteVideoPlayback()}
            style={{height: '100%', width: '100%', backgroundColor: '#000000'}}
          />
        }
      </View>
    );
  }
}
My issue is that whenever I put the application in the background and resume it after 30 seconds, it always starts at AppSplashScreen, whereas I expect it to resume from the last screen. It works correctly if I reopen it within 30 seconds. I assume the OS is killing the app and it is starting from scratch when I resume after 30 seconds.
What could be the issue here? Or is there another workaround to resume the app on the same screen where the user left off?
I solved it by using the state persistence feature of react-navigation.
Here is the documentation: https://reactnavigation.org/docs/4.x/state-persistence/
Here is what my App.js looks like now:
import AsyncStorage from '@react-native-community/async-storage';

const App: () => React$Node = () => {
  const persistenceKey = "navigationStatePersistenceKey"

  const persistNavigationState = async (navState) => {
    try {
      await AsyncStorage.setItem(persistenceKey, JSON.stringify(navState));
    } catch (err) {
      // handle error
    }
  }

  const loadNavigationState = async () => {
    const jsonString = await AsyncStorage.getItem(persistenceKey);
    return JSON.parse(jsonString);
  }

  return (
    <View style={{flex: 1, backgroundColor: '#000000'}}>
      <AppContainer
        persistNavigationState={persistNavigationState}
        loadNavigationState={loadNavigationState}
      />
    </View>
  );
};
It now takes the user back to the same screen where they left off; no more restarting from the first screen.

Storing data on an iOS device using React Native

I am new to React Native and trying to create a simple iOS app. The app has a button; on clicking it, I need to store the timestamp of the click in a file on the device.
I know that React Native has an API called AsyncStorage, but I am getting errors while using it. I copied the code from some site on the net.
Can someone please guide me on how to use this API?
This is my entire code:
import React, {Component} from 'react';
import { StyleSheet, Text, View, TextInput, AsyncStorage } from 'react-native';

export default class App extends Component {
  state = {
    'name': ''
  }

  componentDidMount = () => AsyncStorage.getItem('name').then((value) => this.setState({'name': value}))

  setName = (value) => {
    AsyncStorage.setItem('name': value);
    this.setState({'name': value});
  }

  render() {
    return (
      <View style={styles.container}>
        <TextInput style={styles.textInput} autoCapitalize='none'
          onChangeText={this.setName}/>
        <Text>
          {this.state.name}
        </Text>
      </View>
    );
  }
}

const styles = StyleSheet.create({
  container: {
    flex: 1,
    alignItems: 'center',
    marginTop: 50
  },
  textInput: {
    margin: 15,
    height: 35,
    borderWidth: 1,
    backgroundColor: '#7685ed'
  }
});
As for the error, when I launch the code on iOS, I am getting a red screen. There is no syntax error that I can see.
Thanks in advance.
It's hard to say without more detail about the exact problem you're facing, but I assume some of the following might help.
Ah, I see you posted some code. You will need a constructor that defines your state as well; I've added it in my code below.
Please note I'm not an expert, so forgive any errors.
import React from 'react';
import {
  AsyncStorage,
} from 'react-native';

class myComponent extends React.Component {
  constructor(props) {
    super(props);
    this.state = {
      data: null
    };
  }

  componentDidMount() {
    this._loadInitialState().done();
  }

  _someFunction() {
    var myData = '123'; // AsyncStorage values must be strings
    saveItemLocally('data', myData);
  }

  async _loadInitialState() {
    try {
      // get locally stored data
      var dataStored = await AsyncStorage.getItem('data');
      if (dataStored !== null) {
        this.setState({
          data: dataStored
        });
      }
    } catch (error) {
      // didn't get locally stored data
      console.log(error.message);
    }
  } // end _loadInitialState

  render() {
    // your render function goes here
    return null;
  }
} // end of your component

async function saveItemLocally(item, value) {
  try {
    await AsyncStorage.setItem(item, value);
  } catch (error) {
    console.log('AsyncStorage error: ' + error.message);
  }
}
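Applied to the original question of storing a tap timestamp, a minimal sketch could look like this (the key name and button wiring are assumptions; note that AsyncStorage persists to app storage rather than an arbitrary file):
// store the time of a button press under a hypothetical 'lastPressed' key
async function saveTimestamp() {
  try {
    await AsyncStorage.setItem('lastPressed', new Date().toISOString());
  } catch (error) {
    console.log('AsyncStorage error: ' + error.message);
  }
}
// usage: <Button title="Save" onPress={saveTimestamp} />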

navigator.geolocation.watchPosition not working correctly on an iOS device

I am seeing strange behavior and cannot find a solution. In my React Native application I'm using navigator.geolocation to determine the current position. My application is almost identical to the example in Facebook's Geolocation code.
In the simulator my application works perfectly, but when I deploy it to my iPhone, the position returned (by both getCurrentPosition and watchPosition) is not correct. For example, speed and heading are -1, and accuracy is 65. Longitude and latitude seem to be valid, but they are not my real position (it is in another country).
It does not matter whether I deploy the application to the iPhone via debug or release; it always reacts the same.
But if I start, for example, TomTom, put that application in the background, and then start my GeolocationExample application, everything works as it should.
I created my application as follows:
react-native init GeolocationExample
And then I replaced the file index.ios.js with this code:
import React, { Component } from 'react';
import {
  AppRegistry,
  StyleSheet,
  Text,
  View
} from 'react-native';

export default class GeolocationExample extends React.Component {
  state = {
    initialPosition: 'unknown',
    lastPosition: 'unknown',
  };

  watchID: ?number = null;

  componentDidMount() {
    navigator.geolocation.getCurrentPosition(
      (position) => {
        var initialPosition = JSON.stringify(position);
        this.setState({initialPosition});
      },
      (error) => alert(JSON.stringify(error)),
      {enableHighAccuracy: true, timeout: 20000, maximumAge: 1000}
    );
    this.watchID = navigator.geolocation.watchPosition((position) => {
      var lastPosition = JSON.stringify(position);
      this.setState({lastPosition});
    });
  }

  componentWillUnmount() {
    navigator.geolocation.clearWatch(this.watchID);
  }

  render() {
    return (
      <View style={styles.main}>
        <Text>
          <Text style={styles.title}>Initial position: </Text>
          {this.state.initialPosition}
        </Text>
        <Text>
          <Text style={styles.title}>Current position: </Text>
          {this.state.lastPosition}
        </Text>
      </View>
    );
  }
}

var styles = StyleSheet.create({
  main: {
    margin: 30
  },
  title: {
    fontWeight: '500',
  },
});

AppRegistry.registerComponent('GeolocationExample', () => GeolocationExample);
1st option: add a parameter:
{enableHighAccuracy: true, timeout: 20000, maximumAge: 1000, accuracy: 10} // ten meters; not in the documentation
2nd option: change line 28 of RCTLocationObserver.m:
#define RCT_DEFAULT_LOCATION_ACCURACY kCLLocationAccuracyBest // the default was kCLLocationAccuracyHundredMeters
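For context, here is a minimal sketch of the first option applied to the example's calls; the accuracy field is the undocumented parameter mentioned above:
const geoOptions = {enableHighAccuracy: true, timeout: 20000, maximumAge: 1000, accuracy: 10};
navigator.geolocation.getCurrentPosition(
  (position) => this.setState({initialPosition: JSON.stringify(position)}),
  (error) => alert(JSON.stringify(error)),
  geoOptions
);
// watchPosition takes the same options object as its third argument
this.watchID = navigator.geolocation.watchPosition(
  (position) => this.setState({lastPosition: JSON.stringify(position)}),
  (error) => alert(JSON.stringify(error)),
  geoOptions
);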
