Undefined is not an object (evaluating 'navigator.permissions.query') - javascript

I am getting this error when trying to access my website on an iPhone 7, along with a white blank screen (the main screen loads fine, but I get this on the next screen after I click something).
I assume this is what it's talking about:
useEffect(() => {
  navigator.permissions
    .query({ name: "microphone" })
    .then((permissionStatus) => {
      setMicrophonePermissionGranted(permissionStatus.state === "granted");
      permissionStatus.onchange = function () {
        setMicrophonePermissionGranted(this.state === "granted");
      };
    });
  navigator.permissions.query({ name: "camera" }).then((permissionStatus) => {
    setCameraPermissionGranted(permissionStatus.state === "granted");
    permissionStatus.onchange = function () {
      setCameraPermissionGranted(this.state === "granted");
    };
  });
}, []);
How do I fix this?

You need to check whether the Permissions API is available and, if it is not, fall back to querying the standard APIs directly.
Here is an example using geolocation:
Permissions API
Geolocation API
if (navigator.permissions && navigator.permissions.query) {
  // try the Permissions API first
  navigator.permissions.query({ name: 'geolocation' }).then(function (result) {
    // result.state will be one of 'granted', 'prompt' or 'denied'
    const permission = result.state;
    if (permission === 'granted' || permission === 'prompt') {
      _onGetCurrentLocation();
    }
  });
} else if (navigator.geolocation) {
  // then fall back to the Geolocation API
  _onGetCurrentLocation();
}

function _onGetCurrentLocation() {
  navigator.geolocation.getCurrentPosition(function (position) {
    // imitate a map latlng construct
    const marker = {
      lat: position.coords.latitude,
      lng: position.coords.longitude
    };
  });
}
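The same feature-detection pattern can be applied to the microphone/camera queries from the question. Here is a minimal sketch; `queryPermissionState` is a hypothetical helper (not from the original post), and the permissions object is passed in as a parameter so the fallback logic is explicit:

```javascript
// Query a permission state if the Permissions API exists, otherwise
// return a fallback value ("prompt" here) and let getUserMedia do the
// actual prompting. `permissionsApi` is passed in explicitly so that
// environments without navigator.permissions (e.g. older iOS Safari)
// are handled instead of throwing "undefined is not an object".
async function queryPermissionState(permissionsApi, name, fallback = "prompt") {
  if (permissionsApi && typeof permissionsApi.query === "function") {
    try {
      const status = await permissionsApi.query({ name });
      return status.state; // "granted", "prompt" or "denied"
    } catch {
      // some browsers throw for permission names they don't recognize
      return fallback;
    }
  }
  return fallback; // API missing entirely
}

// Browser usage (sketch):
// const state = await queryPermissionState(navigator.permissions, "microphone");
// setMicrophonePermissionGranted(state === "granted");
```

This keeps the permission check from ever dereferencing `navigator.permissions` when it does not exist, which is what crashes on the iPhone in the question.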

Permissions.query() is marked as an experimental feature as of June 2021: https://developer.mozilla.org/en-US/docs/Web/API/Permissions/query.
As of today, that translates into needing two UIs / flows: one capable of supporting fancy flows that tell the user how to proceed, and a more standard one using try / catch blocks. Something like:
useEffect(() => {
  requestPermissions();
}, []);

const requestPermissions = async () => {
  try {
    handlePermissionsGranted();
    const stream = await navigator.mediaDevices.getUserMedia({ audio: true, video: true });
    startRecording();
  } catch {
    ...
  }
};

const handlePermissionsGranted = async () => {
  if (navigator.permissions && navigator.permissions.query) {
    const permissions = await navigator.permissions.query({ name: 'microphone' });
    permissions.onchange = () => {
      setMicrophonePermissionGranted(permissions.state === 'granted');
    };
  }
};

const startRecording = async () => {
  try {
    const stream = await navigator.mediaDevices.getUserMedia({ audio: true, video: false });
    const mediaRecorder = new MediaRecorder(stream, { mimeType: 'audio/webm' });
    ...
  } catch {
    // reaching this catch means either the browser does not support
    // WebRTC or the user didn't grant permissions
    ...
  }
};

I was trying to check the mic and camera permissions on iOS devices and in the Facebook in-app browser, which I guess makes the whole thing fail, as these APIs don't exist in those environments.
Once I moved that query to the component that only loads on non-mobile devices, the error was fixed.

Why useFocusEffect doesn't notice when the screen is not on focus and comes back to focus

What I want is to notice when the user is on the app screen or off the application (in the Settings screen, in this case).
The reason I'm doing this is that I want to check whether the user's permission is "denied" or "granted",
and if it's "denied", not allow the user to navigate to the other screen, and if it's "granted", allow them to navigate.
const PermissionsIntro = ({ navigation }) => {
  async function configurePushNotifications() {
    const { status } = await Notifications.getPermissionsAsync();
    let finalStatus = status;
    console.log('status of notification', status);
    if (status === 'granted') {
      setNavigate(true);
    } else if (finalStatus !== 'granted') {
      const { status } = await Notifications.requestPermissionsAsync();
      finalStatus = status;
      setNavigate(false);
    } else if (finalStatus !== 'granted') {
      setNavigate(false);
      Alert.alert('Permission required', 'Push notifications need the appropriate permissions.');
      return;
    }
    const pushTokenData = await Notifications.getExpoPushTokenAsync();
    setExpoPushToken(pushTokenData.data);
    if (Platform.OS === 'android') {
      Notifications.setNotificationChannelAsync('default', {
        name: 'default',
        importance: Notifications.AndroidImportance.DEFAULT,
      });
    }
  }

  useFocusEffect(
    React.useCallback(() => {
      configurePushNotifications();
      alert('screen is on application');
      console.log('tfffffff');
      return () => {
        openSettings();
        console.log('screen is on settings');
        // alert('screen is on settings')
      };
    }, [])
  );

  const openSettings = () => {
    Linking.openSettings();
  };

  // onPress function
  const confirm = () => {
    console.log('navigate', navigate);
    if (navigate == true) {
      navigation.navigate('CreateProfile');
    } else {
      console.log('turn the permissions to true!');
      openSettings();
    }
  };
};
When I navigate to this screen it shows me the alert ("screen is on application"),
but when I go to Settings and come back to the application, useFocusEffect is not called at all.
How can I fix this?
useFocusEffect is unfortunately not able to run when the app comes to the foreground/background. In this case you should use AppState instead: AppState can tell you whether the app is in the foreground or background, and notify you when that state changes. More can be found in the documentation.
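A minimal sketch of the AppState approach, adapted to the question's flow (`recheckPermissions` is a hypothetical stand-in for re-running the `getPermissionsAsync` check; the transition test matches the pattern shown in the React Native AppState docs):

```javascript
// Detect the app returning to the foreground, e.g. when the user comes
// back from the Settings screen. Kept as a pure helper so the
// transition logic is easy to test in isolation.
function cameBackToForeground(prevState, nextState) {
  // "inactive" (iOS) and "background" both mean the app was away
  return /inactive|background/.test(prevState) && nextState === "active";
}

// React Native usage (sketch):
// import { useEffect, useRef } from "react";
// import { AppState } from "react-native";
//
// const appState = useRef(AppState.currentState);
// useEffect(() => {
//   const subscription = AppState.addEventListener("change", (nextState) => {
//     if (cameBackToForeground(appState.current, nextState)) {
//       recheckPermissions(); // hypothetical: re-run the permission check here
//     }
//     appState.current = nextState;
//   });
//   return () => subscription.remove();
// }, []);
```

Unlike useFocusEffect, which only tracks navigation focus within the app, this fires when the user returns from the system Settings screen.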

Incorrect webview camera behavior

I have a face recognition app. You press the button, the camera turns on, face recognition runs, done. It works well in the browser on all devices.
There is also an app for iOS and Android. It can also take advantage of this recognition, but only through a webview.
And for some reason, in the webview, the camera does not work as it should. This is what happens: you press the button, a modal window appears asking you to give permission to use the camera (or something like that), and then the camera opens full screen without the prompts and hints that should be there. If I close this window with the live stream, the correct window opens with the hints, but with the camera frozen on the last frame.
const startVideo = async () => {
  options = new TinyFaceDetectorOptions();
  if (
    navigator.mediaDevices &&
    navigator.mediaDevices.getUserMedia &&
    await navigator.mediaDevices.enumerateDevices()
  ) {
    // first we call getUserMedia to trigger permissions
    // we need this before deviceCount, otherwise Safari doesn't return all the cameras
    // we need to have the number in order to display the switch front/back button
    navigator.mediaDevices
      .getUserMedia({
        audio: false,
        video: true
      })
      .then((stream: MediaStream) => {
        stream.getTracks().forEach((track: MediaStreamTrack) => {
          track.stop();
        });
        if (videoElem.current && (videoElem.current.srcObject as MediaStream)) {
          videoElem.current.srcObject = null;
        }
        // init the UI and the camera stream
        initCameraStream();
      })
      .catch(error => {
        // https://developer.mozilla.org/en-US/docs/Web/API/MediaDevices/getUserMedia
        if (error === 'PermissionDeniedError') {
          setModalStatus([modalsNames.videoAccessError, true]);
        }
        if (error.name === 'NotAllowedError') {
          setModalStatus([modalsNames.videoAccessError, true]);
        }
      });
  } else {
    setModalStatus([modalsNames.cameraNotSupportedError, true]);
  }
};
init camera
const initCameraStream = () => {
  // stop any active streams in the window
  if (videoElem.current && (videoElem.current.srcObject as MediaStream)) {
    (videoElem.current.srcObject as MediaStream)
      .getTracks()
      .forEach((track: MediaStreamTrack) => {
        track.stop();
      });
  }
  if (videoElem.current && (videoElem.current.srcObject as MediaStream)) {
    videoElem.current.srcObject = null;
  }
  // we ask for a square resolution; it will be cropped at the top (landscape)
  // or at the sides (portrait)
  const sizeH = 1280;
  const sizeW = 1920;
  const constraints = {
    audio: false,
    video: {
      // width: { ideal: sizeW },
      // height: { ideal: sizeH },
      facingMode: currentFacingMode,
      // aspectRatio: { exact: 1.777777778 }
    }
  };
  const handleSuccess = (stream: MediaStream) => {
    if (videoElem.current) {
      videoElem.current.srcObject = stream;
      videoElem.current.onloadedmetadata = () => {
        if (videoElem.current) {
          onPlay();
        }
      };
    }
  };
  const handleError = () => {
    setModalStatus([modalsNames.cameraNotSupportedError, true]);
  };
  navigator.mediaDevices
    .getUserMedia(constraints)
    .then(handleSuccess)
    .catch(handleError);
};

async/await with Try and Catch block

I am fetching the user's location with Expo. The catch block always executes first, and yet the location is still fetched correctly afterwards, even though catch should only run if the location could not be fetched.
PS: this happens only when I am testing the app on a real device over a Tunnel connection; it works fine on the emulator.
I don't see anything wrong with the code, though.
Please comment.
useEffect(() => {
  const verifyPermissions = async () => {
    const result = await Permissions.askAsync(Permissions.LOCATION);
    if (result.status !== 'granted') {
      Alert.alert(
        'Insufficient permissions!',
        'You need to grant location permissions to use this app.',
        [{ text: 'Okay' }]
      );
      return false;
    }
    return true;
  };

  (async () => {
    const hasPermission = await verifyPermissions();
    if (!hasPermission) {
      return;
    }
    try {
      const location = await Location.getCurrentPositionAsync({});
      setPickedLocation({
        latitude: location.coords.latitude,
        longitude: location.coords.longitude,
      });
    } catch (err) {
      Alert.alert('Could not fetch location!', 'Please try again.', [
        { text: 'Okay' },
      ]);
    }
  })();
}, []);

Expo notifications - no sound

I'm building a React Native/Expo app that uses push notifications. For them to work I'm using the package expo-notifications, as shown below.
First, I set the notification handler:
Notifications.setNotificationHandler({
  handleNotification: async () => ({
    shouldShowAlert: true,
    shouldPlaySound: true,
    shouldSetBadge: false
  })
})
Then I get the push token:
const registerForPushNotificationsAsync = async () => {
  let pushToken
  if (Constants.isDevice) {
    const { status: existingStatus } = await Notifications.getPermissionsAsync()
    let finalStatus = existingStatus
    if (existingStatus !== 'granted') {
      const { status } = await Notifications.requestPermissionsAsync()
      finalStatus = status
    }
    if (finalStatus !== 'granted') {
      alert(I18n.t('errors.push.token'))
      return
    }
    pushToken = (await Notifications.getExpoPushTokenAsync()).data
    AsyncStorage.getItem('push_token').then((savedToken) => {
      if (pushToken !== savedToken) {
        AsyncStorage.setItem('push_token', pushToken)
        Networking.registerDeviceForPushNotifications(pushToken, Platform.OS)
      }
    })
    console.log(pushToken)
  } else {
    alert(I18n.t('errors.push.physical'))
  }
  if (Platform.OS === 'android') {
    Notifications.setNotificationChannelAsync('default', {
      name: 'default',
      importance: Notifications.AndroidImportance.MAX,
      vibrationPattern: [0, 250, 250, 250],
      lightColor: Colors.primary,
      sound: 'default'
    })
  }
  return pushToken
}
At the end I start the listeners:
notificationListener.current = Notifications.addNotificationReceivedListener(notification => {
  setNotification(notification)
})
responseListener.current = Notifications.addNotificationResponseReceivedListener(response => {
  // navigation here
})
The problem:
The notifications sent to my app go through normally, but whatever I do I cannot get sound to work. I want my push messages to play the default system sound when they arrive, but there's nothing and I don't know why.
Your code snippet that refers to Notifications.setNotificationHandler only fires when the app is foregrounded, according to the docs here. You will have to send a test notification to a physical device with the app foregrounded to hear a sound.
It may also help to add a custom sound. This is handled in the app.json file and will require a rebuild, but you can see an example and read more here.
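For reference, a custom sound can be declared via the expo-notifications config plugin in app.json. This is a sketch based on the plugin's documented options; the sound file path is a placeholder, and a new development build is required for it to take effect:

```json
{
  "expo": {
    "plugins": [
      [
        "expo-notifications",
        {
          "sounds": ["./assets/notification.wav"]
        }
      ]
    ]
  }
}
```

The push payload then needs to reference the sound file by name for it to play instead of (or in addition to) the default.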

Async function not returning on android

I'm having a problem with an async function not returning when running on Android, whereas it returns normally on iOS.
This is the function:
_getLocationAsync = async () => {
  let { status } = await Permissions.askAsync(Permissions.LOCATION);
  if (status !== 'granted') {
    this.setState({
      errorMessage: 'Permission to access location was denied',
    });
  }
  let location = await Location.getCurrentPositionAsync({});
  this.setState({ location });
  return location;
};
and I'm using it in another function here:
async fetchMarkers(settings) {
  console.log("fetchMarkers");
  // console.log(settings);
  this.setState({ listLoading: true });
  let location = await this._getLocationAsync();
  console.log("location is ", location);
  ....
  ....
}
This line is not returning on Android, but it returns on iOS. On Android I tried logging the value of location just before returning it in _getLocationAsync, and it logs a defined, correct object, so I'm wondering why it's failing to return:
let location = await Location.getCurrentPositionAsync({});
I'm using React Native 0.53
I think there are some reasons why Android can't get the location. I'm using these location options, and it works well on Android:
// how accurately the location should be obtained
const GET_LOCATION_OPTIONS = {
  enableHighAccuracy: false,
  timeout: 20000,
  maximumAge: 1000,
};

navigator.geolocation.getCurrentPosition(
  (position) => {
    const location = {
      latitude: position.coords.latitude,
      longitude: position.coords.longitude,
    };
    if (callback) { callback(location); }
  },
  (err) => {
    if (callback) { callback(null); }
  },
  GET_LOCATION_OPTIONS,
);
Ref: https://facebook.github.io/react-native/docs/geolocation.html
Maybe it's a permission problem; check whether the app actually requests the location permission.
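On Android, the location permission also has to be declared in the manifest before the runtime request can succeed. For a bare React Native project this means an entry like the following in AndroidManifest.xml (Expo projects declare permissions in app.json instead):

```xml
<!-- AndroidManifest.xml: without this declaration the runtime
     permission dialog never appears and location calls fail -->
<uses-permission android:name="android.permission.ACCESS_FINE_LOCATION" />
```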
