How To Fix .NativeModules.RNGetRandomValues.getRandomBase64 Error - iOS

I am getting the following error when trying to run my React Native app on an iOS device and I am unsure why, any ideas? Just a heads up: the app works fine on the Android simulator.
ERROR TypeError: null is not an object (evaluating '_$$_REQUIRE(_dependencyMap[0], "react-native").NativeModules.RNGetRandomValues.getRandomBase64')
I am building a TikTok clone and I am trying to publish a video I record on my device to my AWS database.
So far I have tried:
adding import 'react-native-get-random-values';,
running the app in release mode.
My code doesn't work when I try to publish the video I record on my phone to the database. I get the error posted above and a warning I created letting me know the video hasn't been published. Again, I am only having this error on the iOS side of the application.
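For reference, this is roughly how I am pulling in the polyfill at the moment (a minimal sketch of my entry file; the index.js / App layout is just an assumption about the project structure, and as far as I understand the polyfill has to be imported before anything that uses uuid):
```javascript
// index.js (entry file name is an assumption about the project layout)
// The polyfill must be imported before anything that pulls in `uuid`,
// and on iOS the native module also needs `cd ios && pod install` after
// adding the package, otherwise NativeModules.RNGetRandomValues is null.
import 'react-native-get-random-values';
import { AppRegistry } from 'react-native';
import App from './App';
import { name as appName } from './app.json';

AppRegistry.registerComponent(appName, () => App);
```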
Here is my code for the screen:
import React, { useState, useRef, useEffect } from 'react';
import { View, Text, TouchableOpacity, TextInput, Button } from 'react-native';
import styles from '/Users/Documents/TikTok/src/screens/CreatePost/styles.js';
import { Storage, API, graphqlOperation, Auth } from 'aws-amplify';
import { useRoute, useNavigation } from '@react-navigation/native';
import { createPost } from '/Users/Documents/TikTok/src/graphql/mutations.js';
import { v4 as uuidv4 } from 'uuid';

const CreatePost = () => {
  const [description, setDescription] = useState();
  const [videoKey, setVideoKey] = useState();
  const route = useRoute();
  const navigation = useNavigation();

  const uploadToStorage = async (imagePath) => {
    try {
      const response = await fetch(imagePath);
      const blob = await response.blob();
      const filename = `${uuidv4()}.mp4`;
      const s3Response = await Storage.put(filename, blob);
      setVideoKey(s3Response.key);
    } catch (e) {
      console.error(e);
    }
  };

  useEffect(() => {
    uploadToStorage(route.params.videoUri);
  }, []);

  const onPublish = async () => {
    if (!videoKey) {
      console.warn('Video is not yet uploaded');
      return;
    }
    try {
      const userInfo = await Auth.currentAuthenticatedUser();
      const newPost = {
        videoUri: videoKey,
        description: description,
        userID: userInfo.attributes.sub,
        songID: '6957a5ce-5f8b-40b0-9f6b-aa68eba19c2b',
      };
      const response = await API.graphql(
        graphqlOperation(createPost, { input: newPost }),
      );
      navigation.navigate('Home', { screen: 'Home' });
      console.warn('Video Uploaded');
    } catch (e) {
      console.log(e);
    }
  };

  return (
    <View style={styles.container}>
      <TextInput
        value={description}
        onChangeText={setDescription}
        numberOfLines={5}
        placeholder="Post Description"
        style={styles.textInput}
      />
      <TouchableOpacity onPress={onPublish}>
        <View style={styles.button}>
          <Text style={styles.buttonText}>Publish</Text>
        </View>
      </TouchableOpacity>
    </View>
  );
};

export default CreatePost;

Related

How to respond with a stream in a SvelteKit server load function

Below I try to respond with a stream when I receive ticker updates.
+page.server.js:
import YahooFinanceTicker from "yahoo-finance-ticker";
const ticker = new YahooFinanceTicker();
const tickerListener = await ticker.subscribe(["BTC-USD"])
const stream = new ReadableStream({
start(controller) {
tickerListener.on("ticker", (ticker) => {
console.log(ticker.price);
controller.enqueue(ticker.price);
});
}
});
export async function load() {
return response????
};
Note: The YahooFinanceTicker can't run in the browser.
How to handle / set the response in the Sveltekit load function.
To my knowledge, the load functions cannot be used for this as their responses are JS/JSON serialized. You can use an endpoint in +server to return a Response object which can be constructed from a ReadableStream.
Solution: H.B.'s comment showed me the right direction for pushing unsolicited price ticker updates to the client.
API route: yahoo-finance-ticker +server.js
import YahooFinanceTicker from "yahoo-finance-ticker";
const ticker = new YahooFinanceTicker();
const tickerListener = await ticker.subscribe(["BTC-USD"])
/** @type {import('./$types').RequestHandler} */
export function GET({ request }) {
const ac = new AbortController();
console.log("GET api: yahoo-finance-ticker")
const stream = new ReadableStream({
start(controller) {
tickerListener.on("ticker", (ticker) => {
console.log(ticker.price);
controller.enqueue(String(ticker.price));
}, { signal: ac.signal });
},
cancel() {
console.log("cancel and abort");
ac.abort();
},
})
return new Response(stream, {
headers: {
'content-type': 'text/event-stream',
}
});
}
page route: +page.svelte
<script>
let result = "";
async function getStream() {
const response = await fetch("/api/yahoo-finance-ticker");
const reader = response.body.pipeThrough(new TextDecoderStream()).getReader();
while (true) {
const { value, done } = await reader.read();
console.log("resp", done, value);
if (done) break;
result += `${value}<br>`;
}
}
getStream();
</script>
<section>
<p>{@html result}</p>
</section>
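As an aside (not part of the original answer): since the endpoint already sets the text/event-stream content type, the same idea could be expressed as proper server-sent events, which lets the page use EventSource instead of reading the body stream by hand. A rough sketch, assuming the route stays at /api/yahoo-finance-ticker:
```javascript
// Server side (+server.js): frame each update as an SSE message.
controller.enqueue(`data: ${ticker.price}\n\n`);

// Client side (+page.svelte <script>): the browser parses the frames
// and handles reconnection automatically.
const source = new EventSource('/api/yahoo-finance-ticker');
source.onmessage = (event) => {
  result += `${event.data}<br>`;
};
```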

iOS peers cannot connect to video call - NotAllowedError raised

PeerJS: 1.3.2
Tested on: iOS 15 & 13.
I have the below call service file that implements PeerJS functionality to init, establish and answer video calls.
Calls work as expected across Android devices, macOS and PCs.
However, when attempting to join from an iOS device, we see the following error raised:
NotAllowedError: The request is not allowed by the user agent
or the platform in the current context, possibly because the
user denied permission.
call-service.js:
import { Injectable } from '@angular/core';
import { MatSnackBar } from '@angular/material/snack-bar';
import Peer from 'peerjs';
import { BehaviorSubject, Subject } from 'rxjs';
import { v4 as uuidv4 } from 'uuid';
@Injectable()
export class CallService {
private peer: Peer;
private mediaCall: Peer.MediaConnection;
private localStreamBs: BehaviorSubject<MediaStream> = new BehaviorSubject(null);
public localStream$ = this.localStreamBs.asObservable();
private remoteStreamBs: BehaviorSubject<MediaStream> = new BehaviorSubject(null);
public remoteStream$ = this.remoteStreamBs.asObservable();
private isCallStartedBs = new Subject<boolean>();
public isCallStarted$ = this.isCallStartedBs.asObservable();
constructor(private snackBar: MatSnackBar) { }
public initPeer(): string {
if (!this.peer || this.peer.disconnected) {
const peerJsOptions: Peer.PeerJSOption = {
debug: 3,
config: {
iceServers: [
{
urls: [
'stun:stun1.l.google.com:19302',
'stun:stun2.l.google.com:19302',
],
}]
}
};
try {
let id = uuidv4();
this.peer = new Peer(id, peerJsOptions);
return id;
} catch (error) {
console.error(error);
}
}
}
public async establishMediaCall(remotePeerId: string) {
try {
const stream = await navigator.mediaDevices.getUserMedia({ video: true, audio: true});
let peerOptions: any = {};
if (this.checkSafari()) {
peerOptions.serialization = "json";
}
const connection = this.peer.connect(remotePeerId, peerOptions);
connection.on('error', err => {
console.error(err);
this.snackBar.open(err, 'Close');
});
this.mediaCall = this.peer.call(remotePeerId, stream);
if (!this.mediaCall) {
let errorMessage = 'Unable to connect to remote peer';
this.snackBar.open(errorMessage, 'Close');
throw new Error(errorMessage);
}
this.localStreamBs.next(stream);
this.isCallStartedBs.next(true);
this.mediaCall.on('stream',
(remoteStream) => {
this.remoteStreamBs.next(remoteStream);
});
this.mediaCall.on('error', err => {
this.snackBar.open(err, 'Close');
console.error(err);
this.isCallStartedBs.next(false);
});
this.mediaCall.on('close', () => this.onCallClose());
}
catch (ex) {
console.error(ex);
this.snackBar.open(ex, 'Close');
this.isCallStartedBs.next(false);
}
}
public async enableCallAnswer() {
try {
let peerOptions: any = {};
if (this.checkSafari()) {
peerOptions.serialization = "json";
}
const stream = await navigator.mediaDevices.getUserMedia({ video: true, audio: true });
this.localStreamBs.next(stream);
this.peer.on('call', async (call) => {
this.mediaCall = call;
this.isCallStartedBs.next(true);
this.mediaCall.answer(stream);
this.mediaCall.on('stream', (remoteStream) => {
this.remoteStreamBs.next(remoteStream);
});
this.mediaCall.on('error', err => {
this.snackBar.open(err, 'Close');
this.isCallStartedBs.next(false);
console.error(err);
});
this.mediaCall.on('close', () => this.onCallClose());
});
}
catch (ex) {
console.error(ex);
this.snackBar.open(ex, 'Close');
this.isCallStartedBs.next(false);
}
}
private onCallClose() {
this.remoteStreamBs?.value.getTracks().forEach(track => {
track.stop();
});
this.localStreamBs?.value.getTracks().forEach(track => {
track.stop();
});
this.snackBar.open('Call Ended', 'Close');
}
public closeMediaCall() {
this.mediaCall?.close();
if (!this.mediaCall) {
this.onCallClose()
}
this.isCallStartedBs.next(false);
}
public destroyPeer() {
this.mediaCall?.close();
this.peer?.disconnect();
this.peer?.destroy();
}
public checkSafari() {
let seemsChrome = navigator.userAgent.indexOf("Chrome") > -1;
let seemsSafari = navigator.userAgent.indexOf("Safari") > -1;
return seemsSafari && !seemsChrome;
}
}
Closing: this was a local permissions issue on my test device and no fault of PeerJS.
Reinstalling Chrome on iOS then enabled the relevant camera permissions.
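For anyone debugging the same thing, here is a small sketch (not part of the original service) of how the NotAllowedError can be caught around the getUserMedia call, so a blocked camera/microphone permission surfaces as a readable message instead of a failed call:
```javascript
// Sketch only: wrap getUserMedia so a denied permission (NotAllowedError)
// is reported explicitly instead of bubbling up as a generic call failure.
async function getLocalStream() {
  try {
    return await navigator.mediaDevices.getUserMedia({ video: true, audio: true });
  } catch (err) {
    if (err && err.name === 'NotAllowedError') {
      // Denied at the browser or OS level; on iOS, check Settings > <browser> > Camera/Microphone.
      console.error('Camera/microphone access was denied:', err.message);
    } else {
      console.error('getUserMedia failed:', err);
    }
    return null;
  }
}
```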

expo-location not working in iOS and not showing location permission in app settings

I am trying to get the current location in iOS 14, but I am getting no response, and when I check in the Expo settings there is no location permission shown there. I have checked on both the simulator and a physical device.
Hook Code
import { useEffect, useState } from "react";
import * as Location from "expo-location";
export default useLocation = () => {
const [location, setLocation] = useState();
const getLocation = async () => {
try {
const { granted } = await Location.requestPermissionsAsync();
if (!granted) return;
const {
coords: { latitude, longitude },
} = await Location.getLastKnownPositionAsync();
setLocation({ latitude, longitude });
} catch (error) {
console.log(error);
}
};
useEffect(() => {
getLocation();
}, []);
return location;
};
Response
undefined
The docs say that Location.getLastKnownPositionAsync() might return null:
Returns a promise resolving to an object of type LocationObject or
null if it's not available or doesn't match given requirements such as
maximum age or required accuracy.
so you should do something like:
import { useEffect, useState } from "react";
import * as Location from "expo-location";
export default useLocation = () => {
const [location, setLocation] = useState();
const getLocation = async () => {
try {
const { granted } = await Location.requestPermissionsAsync();
if (!granted) return;
const last = await Location.getLastKnownPositionAsync();
if (last) setLocation(last);
else {
const current = await Location.getCurrentPositionAsync();
setLocation(current);
}
} catch (error) {
console.log(error);
}
};
useEffect(() => {
getLocation();
}, []);
return location;
};
Use requestForegroundPermissionsAsync() instead of requestPermissionsAsync(), and the problem is solved.
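Putting both answers together, the hook's permission call would look something like this (a sketch based on the snippet above; requestForegroundPermissionsAsync is the current name of the foreground permission API in expo-location):
```javascript
import * as Location from "expo-location";

// Sketch: same flow as above, but with the foreground-specific permission API
// and a fallback to a fresh reading when no cached position is available.
const getLocation = async () => {
  const { granted } = await Location.requestForegroundPermissionsAsync();
  if (!granted) return null;

  const last = await Location.getLastKnownPositionAsync();
  return last ?? (await Location.getCurrentPositionAsync({}));
};
```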

custom editor react-data-grid

I have a weird issue when trying to create a custom autocomplete editor.
Basically, what I've done is pull the built-in AutocompleteEditor class, refactor it to plain ES6, and rename the class to ProductSelectEditor. No modifications to the code logic.
When I try to use it, I'm getting the error "Cannot read property 'onCommit' of undefined" when handleChange() is called:
handleChange() {
this.props.onCommit(); // props undefined
}
Now if I replace the editor with the real built-in AutocompleteEditor, it works just fine. I can't see any obvious reason why my custom version does not work, when the only alterations I'm making are refactoring the code away from TypeScript, renaming the class, and exporting the class as default.
Any clues on what I'm not understanding here?
Below is the whole refactored code:
import React from 'react'
import ReactDOM from 'react-dom'
import ReactAutocomplete from 'ron-react-autocomplete';
import PropTypes from 'prop-types';
import '../css/ron-react-autocomplete.css'
const { shapes: { ExcelColumn } } = require('react-data-grid')
let optionPropType = PropTypes.shape({
id: PropTypes.required,
title: PropTypes.string
});
export default class ProductSelectEditor extends React.Component {
static propTypes = {
onCommit: PropTypes.func,
options: PropTypes.arrayOf(optionPropType),
label: PropTypes.any,
value: PropTypes.any,
height: PropTypes.number,
valueParams: PropTypes.arrayOf(PropTypes.string),
column: PropTypes.shape(ExcelColumn),
resultIdentifier: PropTypes.string,
search: PropTypes.string,
onKeyDown: PropTypes.func,
onFocus: PropTypes.func,
editorDisplayValue: PropTypes.func
};
static defaultProps = {
resultIdentifier: 'id'
};
handleChange() {
this.props.onCommit();
}
getValue() {
let value;
let updated = {};
if (this.hasResults() && this.isFocusedOnSuggestion()) {
value = this.getLabel(this.autoComplete.state.focusedValue);
if (this.props.valueParams) {
value = this.constuctValueFromParams(this.autoComplete.state.focusedValue, this.props.valueParams);
}
} else {
value = this.autoComplete.state.searchTerm;
}
updated[this.props.column.key] = value;
return updated;
}
getEditorDisplayValue() {
let displayValue = {title: ''};
let { column, value, editorDisplayValue } = this.props;
if (editorDisplayValue && typeof editorDisplayValue === 'function') {
displayValue.title = editorDisplayValue(column, value);
} else {
displayValue.title = value;
}
return displayValue;
}
getInputNode() {
return ReactDOM.findDOMNode(this).getElementsByTagName('input')[0];
}
getLabel(item) {
let label = this.props.label != null ? this.props.label : 'title';
if (typeof label === 'function') {
return label(item);
} else if (typeof label === 'string') {
return item[label];
}
}
hasResults() {
return this.autoComplete.state.results.length > 0;
}
isFocusedOnSuggestion() {
let autoComplete = this.autoComplete;
return autoComplete.state.focusedValue != null;
}
constuctValueFromParams(obj, props) {
if (!props) {
return '';
}
let ret = [];
for (let i = 0, ii = props.length; i < ii; i++) {
ret.push(obj[props[i]]);
}
return ret.join('|');
}
render() {
let label = this.props.label != null ? this.props.label : 'title';
return (<div height={this.props.height} onKeyDown={this.props.onKeyDown}>
<ReactAutocomplete search={this.props.search} ref={(node) => this.autoComplete = node} label={label} onChange={this.handleChange} onFocus={this.props.onFocus} resultIdentifier={this.props.resultIdentifier} options={this.props.options} value={this.getEditorDisplayValue()} />
</div>);
}
}
Alright, after a few hours of poking and mangling I found the reason for the props being undefined. Apparently, after stripping out the TypeScript, I needed to re-bind `this` in order to get the correct context:
<ReactAutocomplete ... onChange={this.handleChange.bind(this)} ... />
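An equivalent fix, since the class already relies on class fields for static propTypes, would be to declare the handler as a class-field arrow function, which captures `this` from the instance so no binding is needed in render (a sketch, not from the original post):
```javascript
// Arrow-function class field: `this` is taken from the component instance,
// so this.props.onCommit is defined without an explicit .bind(this) in render.
handleChange = () => {
  this.props.onCommit();
};
```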

cordova-plugin-media-with-compression on iOS can't seem to play audio files after moving

I am using cordova-plugin-media-with-compression in an Ionic 2 app.
On iOS I can record and play back if I pass startRecord() a filename and call that again without changing this.media.
I can't seem to play audio files stored elsewhere in the file system, as I have to pass a new src to startRecord(), and this is the bit I think I am doing incorrectly.
import { Component } from '@angular/core';
import { ModalController, LoadingController, ToastController, Platform } from 'ionic-angular';
import { File, FileEntry, Entry, FileError, DirectoryEntry} from 'ionic-native';
declare var Media: any; // stops errors w/ cordova-plugin-media-with-compression types
@Component({
selector: 'page-add-doc',
templateUrl: 'add-doc.html'
})
export class AddDocPage {
isRecording = false;
isRecorded = false;
audioUrl ='';
localAudioUrl = '';
media: any;
newFileName: string;
newFileNameM4A: string;
homerAudio = 'http://techslides.com/demos/samples/sample.m4a'
constructor(private modalCtrl: ModalController,
private loadingCtrl: LoadingController,
private toastCtrl: ToastController,
private platform: Platform,
) {
platform.ready()
.then(() => {
console.log('Platform Ready');
});
}
ionViewDidLoad() {
this.newFileName = new Date().getTime().toString();
this.newFileNameM4A = this.newFileName +'.m4a';
}
onRecordAudio() {
this.media = new Media(this.newFileNameM4A);
this.media.startRecord();
this.isRecording = true;
}
onStopRecordAudio() {
this.media.stopRecord();
this.media.release();
this.isRecording = false;
this.isRecorded = true;
try {
File.copyFile(File.tempDirectory, this.newFileNameM4A, File.dataDirectory, this.newFileNameM4A)
.then(
(data: Entry) => {
this.audioUrl = data.nativeURL;
});
} catch (FileError) {
console.log(FileError)
};
}
onPlayback() {
this.media = new Media(this.newFileNameM4A);
this.media.play();
this.media.release();
}
onPlaybackTempDirectory() {
this.media = new Media(File.tempDirectory + this.newFileNameM4A);
this.media.play();
this.media.release();
}
onPlaybackDataDirectory() {
this.media = new Media(File.dataDirectory + this.localAudioUrl);
this.media.play();
this.media.release();
}
onHomerAudio() {
this.media = new Media(this.homerAudio)
this.media.play();
this.media.release();
}
}
I believe I may have solved this with the answer from https://issues.apache.org/jira/browse/CB-7007.
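My reading of that ticket (an assumption, not verified here) is that on iOS the plugin does not resolve a bare file:// URL from the data directory, so the playback call needs a plain path. A rough sketch of what onPlaybackDataDirectory might look like with the commonly reported workaround of stripping the scheme:
```javascript
onPlaybackDataDirectory() {
  // Assumption: cordova-plugin-media on iOS wants a plain path rather than a
  // file:// URL, so strip the scheme before handing it to Media. The file name
  // used here is the one the recording was copied under in onStopRecordAudio().
  const path = (File.dataDirectory + this.newFileNameM4A).replace(/^file:\/\//, '');
  this.media = new Media(path);
  this.media.play();
}
```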
