React Native and iOS share button

I am trying to access iOS's share button, where you can share content to all services, including Messages etc.
Any idea how I could do this? Thanks

You now have a simple Share API in React Native:
import { Share } from "react-native";

Share.share(
  {
    title: "a title",
    message: "some message",
    // or
    url: imageReference,
  },
  options
);
See http://facebook.github.io/react-native/docs/share.html

You can achieve this out of the box in React Native - just use ActionSheetIOS.showShareActionSheetWithOptions. See the ActionSheetIOS documentation (this API is iOS-only).
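For reference, a minimal sketch of what that call looks like (the URL, message, and callback bodies here are placeholders):

import { ActionSheetIOS } from 'react-native';

ActionSheetIOS.showShareActionSheetWithOptions(
  {
    url: 'https://example.com',        // placeholder content to share
    message: 'Check this out',
    subject: 'A subject for mail apps',
  },
  (error) => console.warn(error),      // failure callback
  (success, method) => {               // success callback
    if (success) {
      console.log('Shared via ' + method);
    }
  }
);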

You might want to check out the react-native-share package; it should cover your use case. You can also see more relevant packages on JS.Coach.
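A rough sketch of what using that package can look like (the option values are placeholders; check the package README for the exact API):

import Share from 'react-native-share';

Share.open({
  title: 'a title',
  message: 'some message',
  url: 'https://example.com',   // or a file / base64 reference
})
  .then((res) => console.log(res))
  .catch((err) => console.log(err));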

It's much easier than you think. Adding further to @MoOx's answer:
With the new Share API available, you can easily share information from your React Native app just by calling it with the relevant variables and configuration (see the Share docs).
import React from 'react';
import { Share, View, Button } from 'react-native';

const ShareExample = () => {
  const onShare = async () => {
    try {
      const result = await Share.share({
        message:
          'React Native | A framework for building native apps using React',
      });
      if (result.action === Share.sharedAction) {
        if (result.activityType) {
          // shared with activity type of result.activityType
        } else {
          // shared
        }
      } else if (result.action === Share.dismissedAction) {
        // dismissed
      }
    } catch (error) {
      alert(error.message);
    }
  };

  return (
    <View style={{ marginTop: 50 }}>
      <Button onPress={onShare} title="Share" />
    </View>
  );
};

export default ShareExample;

Related

React native expo implementing Apple App Tracking Transparency (ATT) for iOS 14.5

What is the best way of implementing the Apple App Tracking Transparency (ATT) feature in a React Native Expo app? My app keeps getting rejected by Apple even after I add the following to my app.json file:
"infoPlist": {
"NSUserTrackingUsageDescription": "App requires permission...."
}
On iOS 14, Apple introduced the App Tracking Transparency permission, which is required to access the IDFA.
You need to prompt the user to allow (or deny) your app's use of libraries that track them; adding the entry to infoPlist only allows you to use this API within your application.
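For reference, that entry needs to sit under the ios key of app.json; a minimal sketch of the nesting:

{
  "expo": {
    "ios": {
      "infoPlist": {
        "NSUserTrackingUsageDescription": "App requires permission...."
      }
    }
  }
}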
Expo doesn't have this feature yet, but there are some libraries you can use to prompt for the permission.
Example: https://docs.expo.io/versions/v41.0.0/sdk/facebook/#facebookgetpermissionsasync
You can use other libraries, such as https://github.com/mrousavy/react-native-tracking-transparency,
where you can request the App Tracking permission like this:
import { getTrackingStatus } from 'react-native-tracking-transparency';

const trackingStatus = await getTrackingStatus();
if (trackingStatus === 'authorized' || trackingStatus === 'unavailable') {
  // enable tracking features
}

import { requestTrackingPermission } from 'react-native-tracking-transparency';

const trackingStatus = await requestTrackingPermission();
if (trackingStatus === 'authorized' || trackingStatus === 'unavailable') {
  // enable tracking features
}
This might need an update in the near future, as Expo releases a new SDK version with a solution for that.
EDIT
From Expo 44+
Expo now has a library for Tracking Transparency: https://docs.expo.dev/versions/latest/sdk/tracking-transparency/
expo install expo-tracking-transparency
For bare applications: https://github.com/expo/expo/tree/main/packages/expo-tracking-transparency#installation-in-bare-react-native-projects
You can add it as a plugin in your app.json:
{
  "expo": {
    "plugins": [
      [
        "expo-tracking-transparency",
        {
          "userTrackingPermission": "This identifier will be used to deliver personalized ads to you."
        }
      ]
    ]
  }
}
And now you can use it like this:
import React, { useEffect } from 'react';
import { Text, StyleSheet, View } from 'react-native';
import { requestTrackingPermissionsAsync } from 'expo-tracking-transparency';

export default function App() {
  useEffect(() => {
    (async () => {
      const { status } = await requestTrackingPermissionsAsync();
      if (status === 'granted') {
        console.log('Yay! I have user permission to track data');
      }
    })();
  }, []);

  return (
    <View style={styles.container}>
      <Text>Tracking Transparency Module Example</Text>
    </View>
  );
}

// basic container styles referenced above
const styles = StyleSheet.create({
  container: {
    flex: 1,
    justifyContent: 'center',
    alignItems: 'center',
  },
});
You need to request Tracking permissions first (I used react-native-permissions):
import { request, RESULTS, PERMISSIONS } from 'react-native-permissions'

export const requestPermissionTransparency = async () => {
  return await request(PERMISSIONS.IOS.APP_TRACKING_TRANSPARENCY)
}

useEffect(() => {
  ;(async () => {
    const result = await requestPermissionTransparency()
    if (result === RESULTS.GRANTED) {
      // You need to enable analytics (fb, google, etc...)
      await firebase.analytics().setAnalyticsCollectionEnabled(true)
      console.log('Firebase Analytics: ENABLED')
    }
  })()
}, [])
Remember to add this file in the root project:
// <project-root>/firebase.json
{
  "react-native": {
    "analytics_auto_collection_enabled": false
  }
}
References: https://rnfirebase.io/analytics/usage
The solution I ended up using from Expo was Facebook.getPermissionsAsync().
https://expo.canny.io/feature-requests/p/expo-permissions-add-support-to-apptrackingtransparency-permission-on-ios
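A rough sketch of how that can look with expo-facebook (the exact return shape is documented in the Expo SDK 41 docs linked above; treat this as an assumption to verify):

import * as Facebook from 'expo-facebook';

// Prompt for the App Tracking Transparency permission (iOS 14+)
const { status } = await Facebook.requestPermissionsAsync();
if (status === 'granted') {
  // safe to enable tracking / ad-related features
}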
Expo 41+
TrackingTransparency:
https://docs.expo.io/versions/latest/sdk/tracking-transparency/
import { requestTrackingPermissionsAsync } from 'expo-tracking-transparency';

const { status } = await requestTrackingPermissionsAsync();
if (status === 'granted') {
  // do something
}
Expo 40 and below
Admob: https://docs.expo.io/versions/latest/sdk/admob/
import { requestPermissionsAsync } from 'expo-ads-admob'

const { status } = await requestPermissionsAsync()
if (status === 'granted') {
  // do something
}

Deeplinking with a domain name

I have the following code in my App.js:
import React, { useState, useRef, useEffect } from 'react';
import { SafeAreaView, Text } from 'react-native';
import { NavigationContainer, useLinking } from '@react-navigation/native';
import { createStackNavigator } from '@react-navigation/stack';

const Stack = createStackNavigator();

const Screen1 = () => <SafeAreaView><Text>Screen1</Text></SafeAreaView>;
const Screen2 = () => <SafeAreaView><Text>Screen2</Text></SafeAreaView>;

export default function App() {
  const ref = useRef();
  const [isReady, setIsReady] = useState(false);
  const [initialState, setInitialState] = useState();

  const { getInitialState } = useLinking(ref, {
    prefixes: ['http://example.com', 'mychat://'],
    config: {
      screens: {
        Screen2: 'screen-2',
      },
    },
  });

  useEffect(() => {
    getInitialState().then((state) => {
      if (state !== undefined) setInitialState(state);
      setIsReady(true);
    });
  }, [getInitialState]);

  if (!isReady) return null;

  return (
    <NavigationContainer ref={ref} initialState={initialState}>
      <Stack.Navigator>
        <Stack.Screen name='Screen1' component={Screen1} />
        <Stack.Screen name='Screen2' component={Screen2} />
      </Stack.Navigator>
    </NavigationContainer>
  );
}
Most of this is copied from https://reactnavigation.org/docs/deep-linking/ and https://reactnavigation.org/docs/use-linking/.
In the docs the example uses prefixes: ['https://mychat.com', 'mychat://']; I just changed https://mychat.com to http://example.com, but it doesn't seem to work.
When I open the following links in Safari:
mychat:// (works, gets redirected to app Screen1)
mychat://screen-2 (works, gets redirected to app Screen2)
http://example.com (just opens the link in the browser, no popup to redirect to app)
What change do I need to make to redirect the domain name to the mobile app? Am I missing something?
You need to use a domain that you have access to alongside a server.
Your server needs to host a couple of files, typically within the .well-known directory:
apple-app-site-association (note the .json is not needed)
assetlinks.json
You also need to enable some entitlements within your app for iOS; this may also be true for Android. On iOS, this means enabling the Associated Domains entitlement alongside an entry for applinks:yourdomain.com.
The documentation is worth going through to understand what needs to be done in order to achieve Universal Links:
https://developer.apple.com/library/archive/documentation/General/Conceptual/AppSearch/UniversalLinks.html
https://developer.android.com/training/app-links/verify-site-associations
Examples:
iOS - https://stackoverflow.com/.well-known/apple-app-site-association
Android - https://stackoverflow.com/.well-known/assetlinks.json
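For illustration, a minimal apple-app-site-association file could look roughly like this (TEAMID and the bundle identifier are placeholders, and the exact key layout depends on the iOS versions you target):

{
  "applinks": {
    "apps": [],
    "details": [
      {
        "appID": "TEAMID.com.example.mychat",
        "paths": ["/screen-2", "*"]
      }
    ]
  }
}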

No camera found on Chrome IOS with CapacitorJS and Ionic

I have created a PWA with Ionic and Capacitor following this guide: https://ionicframework.com/docs/react/your-first-app.
After I added it to Firebase and started testing the code, I ran into an issue with Chrome on iOS phones. It works on Android and also in web browsers.
But when I click the take-photo button it says "No camera found" and the browser doesn't ask to let it use the camera. If I try the same thing in Safari, it asks for the camera.
Here is the URL to see the test: https://phototest-46598.web.app/tab1
Does anybody experience the same problem? My guess is that it is a new issue, since the guide otherwise seems to work without problems.
Here is my code - I have followed the linked tutorial but not added native support, because I only want to use it as a PWA.
hooks/usePhotoGallery.js file
import { useState, useEffect } from "react";
import { useCamera } from '@ionic/react-hooks/camera';
import { CameraResultType, CameraSource, CameraPhoto, Capacitor,
  FilesystemDirectory } from "@capacitor/core";

export function usePhotoGallery() {
  const { getPhoto } = useCamera();
  const [photos, setPhotos] = useState<Photo[]>([]);

  const takePhoto = async () => {
    const cameraPhoto = await getPhoto({
      resultType: CameraResultType.Uri,
      source: CameraSource.Camera,
      quality: 100
    });
    const fileName = new Date().getTime() + '.jpeg';
    const newPhotos = [{
      filepath: fileName,
      webviewPath: cameraPhoto.webPath
    }, ...photos];
    setPhotos(newPhotos);
  };

  return {
    photos,
    takePhoto
  };
}

export interface Photo {
  filepath: string;
  webviewPath?: string;
  base64?: string;
}
Tab2.tsx file
import React from 'react';
import { camera, trash, close } from 'ionicons/icons';
import { IonContent, IonHeader, IonPage, IonTitle, IonToolbar,
  IonFab, IonFabButton, IonIcon, IonGrid, IonRow,
  IonCol, IonImg, IonActionSheet } from '@ionic/react';
import ExploreContainer from '../components/ExploreContainer';
import { usePhotoGallery } from '../hooks/usePhotoGallery';
import './Tab2.css';

const Tab2: React.FC = () => {
  const { photos, takePhoto } = usePhotoGallery();

  return (
    <IonPage>
      <IonHeader>
        <IonToolbar>
          <IonTitle>Photo Gallery</IonTitle>
        </IonToolbar>
      </IonHeader>
      <IonContent>
        <IonGrid>
          <IonRow>
            {photos.map((photo, index) => (
              <IonCol size="6" key={index}>
                <IonImg src={photo.webviewPath} />
              </IonCol>
            ))}
          </IonRow>
        </IonGrid>
        <IonFab vertical="bottom" horizontal="center" slot="fixed">
          <IonFabButton onClick={() => takePhoto()}>
            <IonIcon icon={camera}></IonIcon>
          </IonFabButton>
        </IonFab>
      </IonContent>
    </IonPage>
  );
};

export default Tab2;
The Camera plugin, when running on the web, uses navigator.mediaDevices.getUserMedia, which is not supported by Chrome for iOS (prior to iOS 14.3).
I've run your app: it says "No camera found" and below that there is a "Choose image" button. If you click it, you'll be prompted to take a picture or choose from the photo library. That's the expected behavior; when there is no camera, or no support for navigator.mediaDevices.getUserMedia, it falls back to using an input with type file.
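For illustration, a minimal sketch of the kind of feature check involved (the plugin's internal logic may differ):

// On Chrome for iOS before 14.3, getUserMedia is unavailable, so a PWA has to
// fall back to an <input type="file"> picker instead of a live camera view.
const canUseGetUserMedia =
  typeof navigator !== 'undefined' &&
  !!navigator.mediaDevices &&
  typeof navigator.mediaDevices.getUserMedia === 'function';

if (!canUseGetUserMedia) {
  console.log('No getUserMedia support - falling back to a file input');
}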

React native: TypeError: null is not an object (evaluating 'SplashScreen.preventAutoHide')

My React Native app was working just fine before I used expo eject. I ejected it because I now intend to build and release the app to the iOS App Store. As soon as I attempt to start the ejected app using react-native run-ios, I get the exception below.
Could someone please help me understand what's causing this issue and how to tackle it?
React Native versions are as follows:
react-native-cli: 2.0.1
react-native: 0.61.5
TypeError: null is not an object (evaluating 'SplashScreen.preventAutoHide')
This error is located at:
in AppLoading (at AppLoading.js:52)
in AppLoading (at App.js:464)
in App (at renderApplication.js:40)
in RCTView (at AppContainer.js:101)
in RCTView (at AppContainer.js:119)
in AppContainer (at renderApplication.js:39)
preventAutoHide
SplashScreen.js:4:21
AppLoading#constructor
AppLoadingNativeWrapper.js:6:8
renderRoot
[native code]:0
runRootCallback
[native code]:0
renderApplication
renderApplication.js:52:52
runnables.appKey.run
AppRegistry.js:116:10
runApplication
AppRegistry.js:197:26
callFunctionReturnFlushedQueue
[native code]:0
The AppLoading component is not available in the bare workflow. As @gaurav-roy said, you have to refactor your code.
Install the expo-splash-screen package with npm install expo-splash-screen
Add a splash screen to your Android and iOS projects. Run npm run expo-splash-screen --help and follow the instructions of this CLI tool. (Because of a bug you might have to run the command again with the -p "ios" flag if it only adds the SplashScreen for Android after running it.)
Change your code inside App.tsx in a similar way as in this example.
If you're working with hooks, you probably want to add a useEffect hook with an empty dependency list which runs an async function. Here's an example of how it could be done:
const App = (props: Props) => {
  const [isLoadingComplete, setLoadingComplete] = useState(false);

  const init = async () => {
    try {
      // Keep on showing the SplashScreen
      await SplashScreen.preventAutoHideAsync();
      await loadResourcesAsync();
    } catch (e) {
      console.warn(e);
    } finally {
      setLoadingComplete(true);
      // Hiding the SplashScreen
      await SplashScreen.hideAsync();
    }
  };

  useEffect(() => {
    init();
  }, []);

  const renderApp = () => {
    if (!isLoadingComplete && !props.skipLoadingScreen) {
      return null;
    }
    return (
      <Main />
    );
  };

  return <StoreProvider>{renderApp()}</StoreProvider>;
}
As is evident from the docs, SplashScreen is an inbuilt API for Expo apps, and since you ejected, it throws an error because it can't be used.
You can see this in the Expo SplashScreen docs.
First you should install it with npm i expo-splash-screen
And then change your import statement to:
import * as SplashScreen from 'expo-splash-screen';
Hope it helps. Feel free to ask if anything is unclear.
After looking through this SO page and then digging into some links, especially this Expo page where they kind of provide a solution for this, I was able to get my app running after about 3 hours of struggle. They haven't added a functional-component example, so I am sharing my code below in case someone comes here looking for the solution.
import { Asset } from "expo-asset";
import * as Font from "expo-font";
import React, { useState, useEffect } from "react";
import { Platform, StatusBar, StyleSheet, View } from "react-native";
import { Ionicons } from "@expo/vector-icons";
import * as SplashScreen from 'expo-splash-screen';
import AppNavigator from "./navigation/AppNavigator";

export default props => {
  const [isLoadingComplete, setLoadingComplete] = useState(false);

  // DefaultTheme comes from the theming library used in the original app
  // (e.g. react-native-paper) and is not part of this snippet's imports.
  const theme = {
    ...DefaultTheme,
    roundness: 2,
    colors: {
      ...DefaultTheme.colors,
      primary: "#E4002B",
      accent: "#E4002B",
    },
  };

  useEffect(() => {
    async function asyncTasks() {
      try {
        await SplashScreen.preventAutoHideAsync();
      } catch (e) {
        console.warn(e);
      }
      await loadResourcesAsync();
      setLoadingComplete(true);
    }
    asyncTasks();
  }, []);

  return (
    !isLoadingComplete && !props.skipLoadingScreen ? null :
    <View style={styles.container}>
      {Platform.OS === "ios" && <StatusBar barStyle="default" />}
      <AppNavigator />
    </View>
  );
}

async function loadResourcesAsync() {
  await Promise.all([
    Asset.loadAsync([
      require("./assets/images/logo.png") // Load your resources here (if any)
    ]),
    Font.loadAsync({
      // You can remove this if you are not loading any fonts
      "space-mono": require("./assets/fonts/SpaceMono-Regular.ttf"),
    }),
  ]);
  await SplashScreen.hideAsync();
}

const styles = StyleSheet.create({
  container: {
    flex: 1,
    backgroundColor: "#fff",
  },
});
This solved it for me for an ejected expo app. Looks like expo was referencing it wrongly.
https://github.com/expo/expo/issues/7718#issuecomment-610508510
What worked for me was updating node_modules/expo/build/launch/splashScreen.js to the following as suggested by adamsolomon1986 in the repo (issue #7718):
import { NativeModules } from 'react-native';
import * as SplashScreen from 'expo-splash-screen';

export function preventAutoHide() {
  if (SplashScreen.preventAutoHide) {
    SplashScreen.preventAutoHide();
  }
}

export function hide() {
  if (SplashScreen.hide) {
    SplashScreen.hide();
  }
}
//# sourceMappingURL=SplashScreen.js.map

How to open Apple Maps using Flutter/Dart?

Our Flutter App shows a number of locations using Google Maps, if available, or else using the local browser.
Although we had previously uploaded an iOS binary which was accepted by Apple and successfully published in the App Store, now that we have added some more locations and attempted to publish a new version, Apple has rejected our binary, stating that it is mandatory to use "Apple Maps" instead of anything that starts with a "G", like Google...
The rejection message reads as follows:
Your app's location feature is not integrated with the built-in mapping functionality, which limits users to a third-party maps app.
Next Steps
To resolve this issue, please revise your app to give users the option to launch the native Apple Maps app.
I have found that there exists some documentation about a Javascript library named MapKit JS, which serves precisely the purpose of interacting with Apple Maps: https://developer.apple.com/maps/mapkitjs/
<script src="https://cdn.apple-mapkit.com/mk/5.x.x/mapkit.js"></script>
<script>
  mapkit.init({
    authorizationCallback: function(done) {
      var xhr = new XMLHttpRequest();
      xhr.open("GET", "/services/jwt");
      xhr.addEventListener("load", function() {
        done(this.responseText);
      });
      xhr.send();
    }
  });

  var Cupertino = new mapkit.CoordinateRegion(
    new mapkit.Coordinate(37.3316850890998, -122.030067374026),
    new mapkit.CoordinateSpan(0.167647972, 0.354985255)
  );

  var map = new mapkit.Map("map");
  map.region = Cupertino;
</script>
Nevertheless, I could really use some help on how to connect with this MapKit JS using Dart, instead of Java, for our Flutter application.
Thank you immensely for your kind help!
Daniel
Firstly, install the url_launcher plugin
Secondly, add the below code in Info.plist:
<key>LSApplicationQueriesSchemes</key>
<array>
  <string>googlechromes</string>
  <string>comgooglemaps</string>
</array>
Thirdly:
var urlAppleMaps = 'https://maps.apple.com/?q=$lat,$lng';
if (await canLaunch(urlAppleMaps)) {
  await launch(urlAppleMaps);
} else {
  throw 'Could not launch $urlAppleMaps';
}
We can use it like this:
_launchMap(BuildContext context, lat, lng) async {
  var url = '';
  var urlAppleMaps = '';
  if (Platform.isAndroid) {
    url = "https://www.google.com/maps/search/?api=1&query=${lat},${lng}";
  } else {
    urlAppleMaps = 'https://maps.apple.com/?q=$lat,$lng';
    url = "comgooglemaps://?saddr=&daddr=$lat,$lng&directionsmode=driving";
    if (await canLaunch(url)) {
      await launch(url);
    } else {
      throw 'Could not launch $url';
    }
  }

  if (await canLaunch(url)) {
    await launch(url);
  } else if (await canLaunch(urlAppleMaps)) {
    await launch(urlAppleMaps);
  } else {
    throw 'Could not launch $url';
  }
}
You could try the map_launcher plugin to launch Apple or Google Maps depending on the platform:
import 'dart:io';
import 'package:map_launcher/map_launcher.dart';

if (Platform.isIOS) {
  await MapLauncher.launchMap(
    mapType: MapType.apple,
    coords: Coords(31.233568, 121.505504),
    title: "Shanghai Tower",
    description: "Asia's tallest building",
  );
} else {
  await MapLauncher.launchMap(
    mapType: MapType.google,
    coords: Coords(31.233568, 121.505504),
    title: "Shanghai Tower",
    description: "Asia's tallest building",
  );
}
Just an idea, but maybe you can try to use the url_launcher plugin to launch a URL following the Apple Maps URL scheme documented here: https://developer.apple.com/library/archive/featuredarticles/iPhoneURLScheme_Reference/MapLinks/MapLinks.html#//apple_ref/doc/uid/TP40007899-CH5-SW1
A simple solution I have found is to replace the google maps URL with a maps.apple.com equivalent, using the same url_launcher function.
For example, the original function for Google Maps:
void _mapaBethania() async {
  const url = "https://www.google.com/maps/place/Panamanian+Institute+for+Special+Training/@9.0067418,-79.5300556,17z/data=!3m1!4b1!4m5!3m4!1s0x8faca84549395297:0x9c54b1fdb96ac590!8m2!3d9.0067365!4d-79.5278669";
  if (await canLaunch(url)) {
    await launch(url);
  } else {
    throw 'Lo sentimos, no es posible abrir: $url';
  }
}
Has become:
void _mapaBethania() async {
  const url = "https://maps.apple.com/?q=IPHE&ll=9.0067418,-79.5300556&z=16";
  if (await canLaunch(url)) {
    await launch(url);
  } else {
    throw 'Lo sentimos, no es posible abrir: $url';
  }
}
Nevertheless, although this change does fulfill Apple's enforced obligation to use their own software instead of the competition's, the user experience with Apple Maps is very poor, because once the Maps app is shown, it becomes a bit confusing and difficult to return to the original app.
Therefore, I am planning to write code that enables both options: Apple Maps in order to comply with Apple's enforcement, and also Google Maps in order to provide a better user experience, despite Apple.
Anyway, the latter is just a personal opinion; the fact is that replacing the URL with a maps.apple.com equivalent, using the same url_launcher function, does seem acceptable to comply with Apple's requirements.
