I'm trying to send data to a Chromecast, but I would like to send the data to a specific Chromecast directly, without selecting it in Google Chrome.
I would like to skip the Chromecast selection step before sending the data.
This is what I want to avoid: I don't want to select the device manually but cast the data to it directly.
I've been checking the session object that we get from chrome.cast.initialize, and it returns something like this:
{
    "sessionId": "b59f1754-fd13-48cd-b237-4952a69cade4",
    "appId": "5B797F56",
    "displayName": "url-cast-sender",
    "statusText": "URL Cast ready...",
    "receiver": {
        "label": "rTflOUigItAIYPwoZZ87Uv5oK8yI.",
        "friendlyName": "Sala de Juntas",
        "capabilities": [
            "video_out",
            "audio_out"
        ],
        "volume": {
            "controlType": "attenuation",
            "level": 1,
            "muted": false,
            "stepInterval": 0.05000000074505806
        },
        "receiverType": "cast",
        "isActiveInput": null,
        "displayStatus": null
    },
    "senderApps": [],
    "namespaces": [
        {
            "name": "urn:x-cast:com.google.cast.debugoverlay"
        },
        {
            "name": "urn:x-cast:com.url.cast"
        }
    ],
    "media": [],
    "status": "connected",
    "transportId": "b59f1754-fd13-48cd-b237-4952a69cade4"
};
As you can see, there is a label in there; I've been trying to work with it, but with no luck.
The way the page requests the connection to a Chromecast is the following:
// click handlers
document.getElementById('requestSession').onclick = function () {
chrome.cast.requestSession(sessionListener, onErr);
};
This seems to be the part that opens the device-selection dialog in Google Chrome.
My work is a fork of url-cast-receiver, and you can check a demo here.
It turns out this is not possible from the frontend.
So I ended up using a library called SharpCaster, created by Tapanila, which has a controller that lets you do this kind of thing; here you can find an example of it.
I had some trouble getting it to work and also opened an issue in the repository, but I ended up fixing it myself (issue #141).
WebPageCastingTester.cs
using System.Linq;
using System.Threading.Tasks;
using SharpCaster.Controllers;
using SharpCaster.Services;
using Xunit;

namespace SharpCaster.Test
{
    public class WebPageCastingTester
    {
        private ChromecastService _chromecastService;

        public WebPageCastingTester()
        {
            // Locate Chromecasts on the network and connect to the first one found
            _chromecastService = ChromecastService.Current;
            var device = _chromecastService.StartLocatingDevices().Result;
            _chromecastService.ConnectToChromecast(device.First()).Wait(2000);
        }

        [Fact]
        public async void TestingLaunchingSharpCasterDemo()
        {
            // Launch the web receiver app on the connected Chromecast
            var controller = await _chromecastService.ChromeCastClient.LaunchWeb();
            await Task.Delay(4000);
            Assert.NotNull(_chromecastService.ChromeCastClient.ChromecastStatus.Applications
                .First(x => x.AppId == WebController.WebAppId));

            // Tell the receiver to load a URL and verify the reported status afterwards
            await controller.LoadUrl("https://www.windytv.com/");
            await Task.Delay(4000);
            Assert.Equal(
                _chromecastService.ChromeCastClient.ChromecastStatus.Applications
                    .First(x => x.AppId == WebController.WebAppId).StatusText,
                "Now Playing: https://www.windytv.com/");
        }
    }
}
I'm using Dynamoose to simplify my interactions with DynamoDB in a node.js application. I'm trying to write a query using Dynamoose's Model.query function that will search a table using an index, but it seems like Dynamoose is not including all of the info required to process the query and I'm not sure what I'm doing wrong.
Here's what the schema looks like:
const UserSchema = new dynamoose.Schema({
    "user_id": {
        "hashKey": true,
        "type": String
    },
    "email": {
        "type": String,
        "index": {
            "global": true,
            "name": "email-index"
        }
    },
    "first_name": {
        "type": String,
        "index": {
            "global": true,
            "name": "first_name-index"
        }
    },
    "last_name": {
        "type": String,
        "index": {
            "global": true,
            "name": "last_name-index"
        }
    }
})
module.exports = dynamoose.model(config.usersTable, UserSchema)
I'd like to be able to search for users by their email address, so I'm writing a query that looks like this:
Users.query("email").contains(query.email)
.using("email-index")
.all()
.exec()
.then( results => {
res.status(200).json(results)
}).catch( err => {
res.status(500).send("Error searching for users: " + err)
})
I have a global secondary index defined for the email field, as shown in the schema above.
When I try to execute this query, I'm getting the following error:
Error searching for users: ValidationException: Either the KeyConditions or KeyConditionExpression parameter must be specified in the request.
Using the Dynamoose debugging output, I can see that the query winds up looking like this:
aws:dynamodb:query:request - {
    "FilterExpression": "contains (#a0, :v0)",
    "ExpressionAttributeNames": {
        "#a0": "email"
    },
    "ExpressionAttributeValues": {
        ":v0": {
            "S": "mel"
        }
    },
    "TableName": "user_qa",
    "IndexName": "email-index"
}
I note that the actual query sent to DynamoDB does not contain KeyConditions or KeyConditionExpression, as the error message indicates. What am I doing wrong that prevents this query from being written correctly such that it executes the query against the global secondary index I've added for this table?
As it turns out, calls like .contains(text) are used as filters, not query parameters. DynamoDB can't figure out if the text in the index contains the text I'm searching for without looking at every single record, which is a scan, not a query. So it doesn't make sense to try to use .contains(text) in this context, even though it's possible to call it in a chain like the one I constructed. What I ultimately needed to do to make this work is turn my call into a table scan with the .contains(text) filter:
Users.scan({ email: { contains: query.email }}).all().exec().then( ... )
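If an exact match on the address is enough, the lookup can stay a real query against the GSI, because .eq() on the indexed attribute supplies the key condition that DynamoDB was complaining about. A minimal sketch, assuming the same Users model and email-index as above:
// Exact-match query against the email-index GSI; .eq() becomes the key condition
Users.query("email").eq(query.email)
    .using("email-index")
    .exec()
    .then( results => {
        res.status(200).json(results)
    }).catch( err => {
        res.status(500).send("Error searching for users: " + err)
    })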
I am not too familiar with Dynamoose, but the code below will do an update on a record using Node.js and DynamoDB. See the Key parameter below; from the error message you got, it seems you are missing this.
To my knowledge, you must specify a key for an update request. You can check the AWS DynamoDB docs to confirm.
// Assumes the AWS SDK v2 DocumentClient and an async context, e.g.:
// var AWS = require("aws-sdk");
// var docClient = new AWS.DynamoDB.DocumentClient();
var params = {
    TableName: table,
    Key: {
        "id": customerID
    },
    UpdateExpression: "set customer_name = :s, customer_address = :p, customer_phone = :u, end_date = :e",
    ExpressionAttributeValues: {
        ":s": customer_name,
        ":p": customer_address,
        ":u": customer_phone,
        ":e": end_date   // the original snippet reused :u for end_date; it needs its own placeholder and value
    },
    ReturnValues: "UPDATED_NEW"
};

await docClient.update(params).promise();
I have been trying to figure out how to do 2FA with WebAuthn, and I have the registration part working. The details are really poorly documented, especially all of the encoding of payloads in JavaScript. I am able to register a device to a user, but I am not able to authenticate with that device. For reference, I'm using these resources:
https://github.com/cedarcode/webauthn-ruby
https://www.passwordless.dev/js/mfa.register.js
And specifically, for authentication, I'm trying to mimic this js functionality:
https://www.passwordless.dev/js/mfa.register.js
In my user model, I have a webauthn_id, and several u2f devices, each of which has a public_key and a webauthn_id.
In my Rails app, I do:
options = WebAuthn::Credential.options_for_get(allow: :webauthn_id)
session[:webauthn_options] = options
In my JavaScript, I try to mimic the JS file above and do the following (this is embedded Ruby):
options = <%= raw @options.as_json.to_json %>
options.challenge = WebAuthnHelpers.coerceToArrayBuffer(options.challenge);
options.allowCredentials = options.allowCredentials.map((c) => {
    c.id = WebAuthnHelpers.coerceToArrayBuffer(c.id);
    return c;
});
navigator.credentials.get({ "publicKey": options }).then(function (credentialInfoAssertion) {
    // send assertion response back to the server
    // to proceed with the control of the credential
    alert('here');
}).catch(function (err) {
    debugger
    console.error(err); /* THIS IS WHERE THE ERROR IS THROWN */
});
The problem is, I cannot get past navigator.credentials.get, I get this error in the javascript console:
TypeError: CredentialsContainer.get: Element of 'allowCredentials' member of PublicKeyCredentialRequestOptions can't be converted to a dictionary
options, at the time navigator.credentials.get is called, looks like the serialized JSON shown in the update below.
I've tried every which way to convert my DB-stored user and device variables into properly encoded and parsed JavaScript variables, but I cannot seem to get it to work. Is there anything obvious that I'm doing wrong?
Thanks for any help,
Kevin
UPDATE -
Adding options json generated by the server:
"{\"challenge\":\"SSDYi4I7kRWt5wc5KjuAvgJ3dsQhjy7IPOJ0hvR5tMg\",\"timeout\":120000,\"allowCredentials\":[{\"type\":\"public-key\",\"id\":\"OUckfxGNLGGASUfGiX-1_8FzehlXh3fKvJ98tm59mVukJkKb_CGk1avnorL4sQQASVO9aGqmgn01jf629Jt0Z0SmBpDKd9sL1T5Z9loDrkLTTCIzrIRqhwPC6yrkfBFi\"},{\"type\":\"public-key\",\"id\":\"Fj5T-WPmEMTz139mY-Vo0DTfsNmjwy_mUx6jn5rUEPx-LsY51mxNYidprJ39_cHeAOieg-W12X47iJm42K0Tsixj4_Fl6KjdgYoxQtEYsNF-LPhwtoKwYsy1hZgVojp3\"}]}"
This is an example of the serialised JSON data returned by our implementation:
{
    "challenge": "MQ1S8MBSU0M2kiJqJD8wnQ",
    "timeout": 60000,
    "rpId": "identity.acme.com",
    "allowCredentials": [
        {
            "type": "public-key",
            "id": "k5Ti8dLdko1GANsBT-_NZ5L_-8j_8TnoNOYe8mUcs4o",
            "transports": [
                "internal"
            ]
        },
        {
            "type": "public-key",
            "id": "LAqkKEO99XPCQ7fsUa3stz7K76A_mE5dQwX4S3QS6jdbI9ttSn9Hu37BA31JUGXqgyhTtskL5obe6uZxitbIfA",
            "transports": [
                "usb"
            ]
        },
        {
            "type": "public-key",
            "id": "nbN3S08Wv2GElRsW9AmK70J1INEpwIywQcOl6rp_DWLm4mcQiH96TmAXSrZRHciZBENVB9rJdE94HPHbeVjtZg",
            "transports": [
                "usb"
            ]
        }
    ],
    "userVerification": "discouraged",
    "extensions": {
        "txAuthSimple": "Sign in to your ACME account",
        "exts": true,
        "uvi": true,
        "loc": true,
        "uvm": true
    }
}
This is parsed to an object and the code used to coerce those base64url encoded values is:
credentialRequestOptions.challenge = WebAuthnHelpers.coerceToArrayBuffer(credentialRequestOptions.challenge);
credentialRequestOptions.allowCredentials = credentialRequestOptions.allowCredentials.map((c) => {
    c.id = WebAuthnHelpers.coerceToArrayBuffer(c.id);
    return c;
});
Hope that helps. The JSON data is retrieved via a fetch() call, and the byte[] fields are encoded as base64url on the server side.
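If WebAuthnHelpers isn't available in your bundle, the base64url-to-ArrayBuffer coercion is small enough to write by hand. The sketch below is an assumption about what such a helper can look like, not the passwordless.dev original:
// Hypothetical helper: decode a base64url string into an ArrayBuffer
function coerceToArrayBuffer(value) {
    // base64url -> base64, then pad to a multiple of 4 characters
    var base64 = value.replace(/-/g, "+").replace(/_/g, "/");
    while (base64.length % 4) { base64 += "="; }
    var binary = atob(base64);
    var bytes = new Uint8Array(binary.length);
    for (var i = 0; i < binary.length; i++) {
        bytes[i] = binary.charCodeAt(i);
    }
    return bytes.buffer;
}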
I use Meteor to query a MongoDB collection. It has, for example, the following entry:
{
    "_id": "uCfwxKXyZygcWQeiS",
    "gameType": "foobar",
    "state": "starting",
    "hidden": {
        "correctAnswer": "secret",
        "someOtherStuff": "foobar"
    },
    "personal": {
        "Y73uBhuDq2Bhk4d8W": {
            "givenAnswer": "another secret"
        },
        "hQphob8s92gbEMXbY": {
            "givenAnswer": "i have no clue"
        }
    }
}
What I am trying to do now is:
don't return the values behind "hidden"
from the "personal" embedded document only return the values for the asking user
In code it would look something like this:
Meteor.publish('game', function() {
    this.related(function(user) {
        var fields = {};
        fields.hidden = 0;
        fields.personal = 0;
        fields['personal.' + this.userId] = 1;
        return Games.find({}, {fields: fields});
    }, Meteor.users.find(this.userId, {fields: {'profile.gameId': 1}}));
});
Obviously this won't work, because MongoDB won't allow mixed includes and excludes. On the other hand, I cannot switch to "specify only the included fields", because they can vary from gameType to gameType and it would become a large list.
I really hope that you can help me out of this. What can I do to solve the problem?
This is a typical example of where to use the directly controlled publication features (the this.added/removed/changed methods).
See the second example block a bit down the page at http://docs.meteor.com/api/pubsub.html#Meteor-publish.
With this pattern you get complete control of when and what to publish.
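A minimal sketch of that pattern, assuming a server-side collection named 'games' and that restricting by this.userId is all the access control you need (the field shapes follow the example document above, but the filtering helper is hypothetical):
Meteor.publish('game', function () {
    var self = this;

    // Drop "hidden" entirely and keep only the subscriber's entry under "personal"
    function filterFields(doc) {
        var fields = _.omit(doc, '_id', 'hidden', 'personal');
        if (doc.personal && doc.personal[self.userId]) {
            fields.personal = {};
            fields.personal[self.userId] = doc.personal[self.userId];
        }
        return fields;
    }

    var handle = Games.find({}).observeChanges({
        added: function (id, doc) {
            self.added('games', id, filterFields(doc));
        },
        changed: function (id, changedFields) {
            self.changed('games', id, filterFields(changedFields));
        },
        removed: function (id) {
            self.removed('games', id);
        }
    });

    self.ready();
    self.onStop(function () {
        handle.stop();
    });
});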
I'm using the calendar.events.insert API to add an event to my calendar via the PHP client.
The event is being inserted correctly, along with the appropriate values as set by the API.
However, it does not trigger an email invite to the attendees. I looked around and found that the request needs to set the sendNotifications parameter to true.
That doesn't seem to help either.
Here is a sample code:
var request = gapi.client.calendar.events.insert({
    "calendarId" : calendarData.id,
    "sendNotifications": true,
    "end": {
        "dateTime": eventData.endTime
    },
    "start": {
        "dateTime": eventData.startTime
    },
    "summary": eventData.eventName,
    "attendees": jQuery.map(eventData.attendees, function(a) {
        return {'email' : a};
    }),
    "reminders": {
        "useDefault": false,
        "overrides": [
            {
                "method": "email",
                "minutes": 15
            },
            {
                "method": "popup",
                "minutes": 15
            }
        ]
    }
});
Where eventData and calendarData are appropriate objects.
Although my main problem is with email invites being sent the first time, I also tried (as can be seen above) to set a reminder (using overrides). While the popup works as expected, I didn't receive an email update in this case either.
This makes me wonder whether this may be a permission issue - something which I need to enable for my app perhaps (the user would understandably need to know if my app is sending emails on their behalf)?
In the Google API Documentation for inserting events, the "sendNotifications" option is actually a parameter. You might want to put it in the request parameters instead of the body.
In Meteor
Note: In my Meteor application, I did the request by hand, and I'm still new to JavaScript. I'm not sure how you would do that in plain JavaScript or with the Calendar API, so I'll just put the Meteor code here; hope it helps, although it's a bit off-topic.
var reqUrl = "https://www.googleapis.com/calendar/v3/calendars/primary/events";
var payload = {
'headers' : {
'Authorization': "Bearer " + token,
'Content-Type': 'application/json'
},
'params': {
'sendNotifications': true
},
'data': {
"summary": summary,
"location": "",
"start": {
"dateTime": start
},
"end": {
"dateTime": end
},
"attendees": [
{
"email": "*********#gmail.com"
}
]
}
};
Meteor.http.post(reqUrl, reqParams, function () {});
@linaa is correct. I just ran into this issue myself.
In JS, this would look like:
var request = gapi.client.calendar.events.insert({
    calendarId: calendarData.id,
    sendNotifications: true,   // request parameter, not part of the event body
    resource: {
        // request body (the event resource) goes here
    }
});
For this, you should set the "remindOnRespondedEventsOnly" value to "true".
This means: whether event reminders should be sent only for events with the user's response status "Yes" and "Maybe".
You can find this information here.
Hope that helps!
event = service.events().insert(calendarId='primary', body=event, sendUpdates='all').execute()
This will work.
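For anyone on the JavaScript client rather than the Python one above, the equivalent would be passing sendUpdates alongside the other request parameters. A rough sketch (the event fields are placeholders):
// Sketch: sendUpdates is a request parameter; the event itself goes under "resource"
gapi.client.calendar.events.insert({
    calendarId: 'primary',
    sendUpdates: 'all',
    resource: {
        summary: 'Example event',
        start: { dateTime: '2024-01-15T10:00:00Z' },
        end: { dateTime: '2024-01-15T11:00:00Z' },
        attendees: [ { email: 'guest@example.com' } ]
    }
}).then(function (response) {
    console.log('Created event', response.result.id);
});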
I'm currently trying to put something together with Ember + ember-data + the router + ASP.NET Web API. Most of it seems to work; however, I'm stuck on an error message I get when ember-data tries to findAll through the adapter for my models.
In my backend I have a model like this (C#):
public class Genre {
    [Key]
    public int Id { get; set; }

    [Required]
    [StringLength(50, MinimumLength=3)]
    public string Name { get; set; }
}
Which I represent in my app like this using ember-data:
App.Genre = DS.Model.extend({
    id: DS.attr("number"),
    name: DS.attr("string")
}).reopenClass({
    url: 'api/genre'
});
I also have a Store defined in my app using the RESTAdapter, like so:
App.store = DS.Store.create({
    revision: 4,
    adapter: DS.RESTAdapter.create({
        bulkCommit: false
    })
});
And the store is used in my controller as below:
App.GenreController = Ember.ArrayController.extend({
    content: App.store.findAll(App.Genre),
    selectedGenre: null
});
The router is defined as
App.router = Em.Router.create({
    enableLogging: true,
    location: 'hash',
    root: Ember.Route.extend({
        //...
        genre: Em.Route.extend({
            route: '/genre',
            index: Ember.Route.extend({
                connectOutlets: function (router, context) {
                    router.get('applicationController').connectOutlet('genre');
                }
            })
        }),
        //...
    })
})
When I run my application, I get the following message for every object that has this same structure:
Uncaught Error: assertion failed: Your server returned a hash with the
key 0 but you have no mappings
For reference, here's the json the service is returning:
[
    {
        "id": 1,
        "name": "Action"
    },
    {
        "id": 2,
        "name": "Drama"
    },
    {
        "id": 3,
        "name": "Comedy"
    },
    {
        "id": 4,
        "name": "Romance"
    }
]
I cannot tell exactly what the problem is, and since the assertion mentions that I need a mapping, I'd like to know:
What this mapping is and how to use it.
Since the returned JSON is an array, should I be using a different type of controller in my app, or is there anything I should know about when working with this type of JSON in ember-data? Or should I change the JsonFormatter options on the server?
Any help is welcome.
I can definitely add more information if you feel this isn't enough to understand the problem.
EDIT: I've changed a few things in my backend, and now my findAll() equivalent action on the server serializes the output as the following JSON:
{
    "genres": [
        { "id": 1, "name": "Action" },
        { "id": 2, "name": "Drama" },
        { "id": 3, "name": "Comedy" },
        { "id": 4, "name": "Romance" }
    ]
}
But I still can't get it to populate my models in the client and my error message has changed to this:
Uncaught Error: assertion failed: Your server returned a hash with the
key genres but you have no mappings
Not sure what else I might be doing wrong.
The method that throws this exception is sideload and checks for the mappings like this:
sideload: function (store, type, json, root) {
    var sideloadedType, mappings, loaded = {};

    loaded[root] = true;

    for (var prop in json) {
        if (!json.hasOwnProperty(prop)) { continue; }
        if (prop === root) { continue; }

        sideloadedType = type.typeForAssociation(prop);

        if (!sideloadedType) {
            mappings = get(this, 'mappings');
            Ember.assert("Your server returned a hash with the key " + prop + " but you have no mappings", !!mappings);
            //...
The call sideloadedType = type.typeForAssociation(prop); returns undefined, and then I get the error message. The method typeForAssociation() checks for the 'associationsByName' key, which returns an empty Ember.Map.
Still no solution for this at the moment.
By the way...
My action is now like this:
// GET api/genres
public object GetGenres() {
return new { genres = context.Genres.AsQueryable() };
}
// GET api/genres
//[Queryable]
//public IQueryable<Genre> GetGenres()
//{
// return context.Genres.AsQueryable();
//}
I had to remove the original implementation, which gets serialized by Json.NET, as I could not find config options to produce the JSON output Ember Data expects (as in {resource_name : [json, json, ...]}). A side effect of this is that I've lost built-in OData support, which I'd like to keep. Does anyone know how I could configure it to produce different JSON for a collection?
The mapping can be defined in the DS.RESTAdapter. I think you could try to define something like this:
App.Store = DS.Store.extend({
    adapter: DS.RESTAdapter.create({
        bulkCommit: true,
        mappings: {
            genres: App.Genre
        },
        // you can also define plurals, if there is an irregular plural.
        // usually, RESTAdapter simply adds an 's' for plurals.
        // for example, at work we have to define something like this:
        plurals: {
            business_process: 'business_processes'
            // else it tries to fetch business_processs
        }
    }),
    revision: 4
});
Hope this resolves your problem.
Update:
At this time, this is not well documented; I don't remember if we found it by ourselves reading the code, or whether Tom Dale pointed it out.
Anyway, here is the point of plurals.
For the mappings, I think we were driven by the same error as you, and either we worked it out ourselves or Tom taught us about it.
The RESTAdapter expects the returned JSON to be of the form:
{
    "genres": [{
        "id": 1,
        "name": "action"
    }, {
        "id": 2,
        "name": "Drama"
    }]
}
The tests are a good source of documentation, see https://github.com/emberjs/data/blob/master/packages/ember-data/tests/unit/rest_adapter_test.js#L315-329
I'm using Ember Data rev. 11, and it seems that the plurals config in DS.RESTAdapter.create never works. I looked into the code and found the following solution:
App.Adapter = DS.RESTAdapter.extend({
    bulkCommit: false
})

App.Adapter.configure('plurals', {
    series: 'series'
})