Meteor FS.Collection image doesn't render on upload - javascript

Whenever I upload an image with FS.Collection using the FileSystem store, the image doesn't render and I get a 503 error. But if I refresh the page, the image renders and there is no error. As a workaround I set the image path to a public folder using ostrio:meteor-root, so whenever I upload an image the page refreshes. I'm fetching the image URL from a Mongo.Collection, not from my FS.Collection.
When I upload an image I store the URL on Adverts:
{
  "_id": "knCMZPK8RrY5Y7GQo",
  "reference": 102020026,
  "pics": {
    "pic": [
      { "url": "http://localhost:3000/cfs/files/Images/6fHhBT3ky5iAJnQfw" }
    ]
  }
}
Images.js:
var imageStore = new FS.Store.FileSystem("Images", {
  path: Meteor.absolutePath + '/public/uploads'
});

Images = new FS.Collection("Images", {
  stores: [imageStore],
  filter: {
    allow: {
      contentTypes: ['image/*']
    }
  }
});
if (Meteor.isServer) {
  Images.allow({
    insert: function (userId, party) {
      return true;
    },
    update: function (userId, party) {
      return true;
    },
    download: function (userId, party) {
      return true;
    },
    remove: function (userId, party) {
      return true;
    }
  });
}
Image not rendering:

ANSWER:
Instead of using FS.Collection I switched to ostrio:files.
Since I only wanted to upload images to the document I was already updating (I have a reactive-table of my collection data; each row is clickable and holds a document, and clicking a row uses iron:router to navigate to a page with an AutoForm that updates that single document), I make a Meteor.call on the client to send the document _id to the server:
Meteor.call('docId', this._id)
It seemed like the image wasn't rendering because the thumbnail was being created before the upload had finished.
So to fix this I registered an afterUpload callback server-side in the docId method:
FSCollection.off('afterUpload');
FSCollection.on('afterUpload', function (fileRef) {
  // Update the Mongo.Collection (...)
});
If I don't call .off before .on, the handlers keep stacking up and the code inside the callback runs one extra time per registration: when the first image is uploaded the code executes once, when the second is uploaded it executes twice, and so on.

Related

PouchDB delete data on device without affecting remote sync

Right now I am replicating my entire device database over to my remote database.
Once that is complete, I grab all my data that is not older than 1 month from my remote database, using a filter, and bring it to my device.
FILTER
{
  _id: '_design/filters',
  "filters": {
    "device": function (doc, req) {
      if (doc.type == "document" || doc.type == "signature") {
        if (doc.created >= req.query.date) return true;
        else return false;
      }
      else return true;
    }
  }
}
REPLICATION
device_db.replicate.to(remote_db)
  .on('complete', function () {
    device_db.replicate.from(remote_db, {
      filter: "filters/device",
      query_params: { "date": (Math.floor(Date.now() / 1000) - 2419200) }
    })
    .on('complete', function () {
      console.log("localtoRemoteSync replicate.to success");
      callback(true);
    });
  });
My question:
I want to periodically delete data from my device that is older than 3 months (old enough that I already know it has been synced).
But just because I delete it from my device, I don't want that deletion carried over to my remote_db when I replicate back to it.
How can I delete specific data on my device without that deletion being replicated?
FILTERS
Here, we have 2 filters:
noDeleted : This filter doesn't push _deleted documents.
device : Filter to get the latest data only.
{
  _id: '_design/filters',
  "filters": {
    "device": function (doc, req) {
      if (doc.type == "document" || doc.type == "signature") {
        if (doc.created >= req.query.date) return true;
        else return false;
      }
      return true;
    },
    "noDeleted": function (doc, req) {
      // _deleted documents won't pass through this filter.
      // If we delete a document locally, the delete won't be replicated to the remote DB.
      return !doc._deleted;
    }
  }
}
REPLICATION
device_db.replicate.to(remote_db, {
  filter: "filters/noDeleted"
})
.on('complete', function () {
  device_db.replicate.from(remote_db, {
    filter: "filters/device",
    query_params: { "date": (Math.floor(Date.now() / 1000) - 2419200) }
  })
  .on('complete', function () {
    console.log("localtoRemoteSync replicate.to success");
    callback(true);
  });
});
Workflow
You push all your documents without pushing the deleted ones.
You get all the updates for the latest data.
You delete your old documents, in one of two ways:
You could query the remote DB for the ids of documents that are too old and delete them locally. Note that the documents will still be there as _deleted; to completely remove them, a compaction is required.
You could also destroy your local database entirely after step 1 and start from scratch.
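The first delete option can be split into pure logic plus two PouchDB calls. A sketch of the stub-building part (the `created` field name is an assumption from the filters above); feeding the stubs to `bulkDocs` deletes the docs locally, and the `noDeleted` filter keeps those deletes from being pushed:

```javascript
// Build _deleted stubs for every local doc older than the cutoff.
// Row shape matches the output of allDocs({ include_docs: true }).
function buildDeleteStubs(rows, cutoff) {
  return rows
    .filter(function (row) { return row.doc.created < cutoff; })
    .map(function (row) {
      return { _id: row.doc._id, _rev: row.doc._rev, _deleted: true };
    });
}

// Usage sketch:
// device_db.allDocs({ include_docs: true })
//   .then(function (r) { return device_db.bulkDocs(buildDeleteStubs(r.rows, cutoff)); })
//   .then(function () { return device_db.compact(); }); // shrink the local tombstones
```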
Add a one-way filtered replication. Anything you need back on the server, however, you will have to send with a put request using the server's _rev.
For example: replicate from server to client, then add a filter field, like transfer: true, to the docs you want to replicate:
db.replicate.from(remoteDB, {
  live: true,
  retry: true,
  selector: { transfer: true } // or any other type of selector
});
To delete a doc on the client, set transfer to false, then delete it on the client; it won't meet the filter criteria, so the deletion won't replicate.
Anything you want to push back to the server, send with a put request instead of replication.
If you want the document back on the client, just set transfer to true in the doc again.

How do you call arbitrary functions using addRules with declarativeContent listeners in Chrome Extensions?

I am trying to change an extension I'm working on from a browser action to a page action. Currently, the extension detects a tab change and, if the tab's URL is on one of our domains, requests JSON status data so it can show the good icon or the bad icon as a status indicator. My browser action code was this:
chrome.tabs.onActivated.addListener(function (activeInfo) {
  chrome.tabs.get(activeInfo.tabId, function (tab) {
    var url = tab.url,
        matches = url.match(/(domain1|domain2)\.com/g);
    if (url && matches) {
      // Query for JSON data and change the icon based on it.
      loadReach(url);
    } else {
      // Change the icon to the bad state.
    }
  });
});
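As an aside, the domain check can be pulled into a small helper. Note the grouped alternation, so that `.com` applies to both names; with `/(domain1)|(domain2)\.com/` the first alternative matches `domain1` anywhere, with or without `.com`:

```javascript
// True when the URL contains domain1.com or domain2.com.
function matchesDomains(url) {
  return /(domain1|domain2)\.com/.test(url);
}
```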
I have the basic listeners in place to insert the declarativeContent listener and show the initial icon, but am unsure where to put my callback that makes the query for the JSON data:
// When the extension is installed or upgraded ...
chrome.runtime.onInstalled.addListener(function () {
  // Replace all rules ...
  chrome.declarativeContent.onPageChanged.removeRules(undefined, function () {
    // With a new rule ...
    chrome.declarativeContent.onPageChanged.addRules([
      {
        conditions: [
          new chrome.declarativeContent.PageStateMatcher({
            pageUrl: { hostContains: 'domain1.com' }
          }),
          new chrome.declarativeContent.PageStateMatcher({
            pageUrl: { hostContains: 'domain2.com' }
          })
        ],
        // And shows the extension's page action.
        actions: [new chrome.declarativeContent.ShowPageAction()]
      }
    ]);
  });
});
Where in the second codeblock would I be able to run that callback, or is it not supported by this method?

backbone.js - model.save() generates wrong PUT url path

I hate to ask about strange problems like this, but I couldn't avoid this one.
I have an "Option" view with an "Option" model passed as a parameter on creation.
var optionView = new OptionView({ model: option });
this.$el.find('div#optionsBoard').append( optionView.render().el );
In this view, when the user clicks the "Vote" button, the model's "voteCount" attribute is incremented.
events: { 'click .button-vote': 'processVote' },

processVote: function (e) {
  var voteCounted = this.model.get('voteCount');
  this.model.set('voteCount', voteCounted + 1); // note: voteCounted++ would set the old value back
  console.log(this.model.id);      // has an id value
  console.log(this.model.isNew()); // false
  this.model.save();               // the problem occurs here
  e.preventDefault();
},
The problem occurs when I save the model back to the server, as follows:
PUT http://localhost:13791/api/options/ 404 (Not Found)
Yes, this URL doesn't actually exist on my REST API server. But I believe the correct PUT URL to update the model should be the following:
PUT http://localhost:13791/api/options/id_of_the_entity_to_be_updated
When I test this PUT URL (http://localhost:13791/api/options/id_of_the_entity_to_be_updated) with the Postman REST client, it works perfectly.
So I think the problem is that Backbone's model.save() method does not append id_of_the_entity_to_be_updated to the PUT URL.
Please suggest how I should solve this problem.
As additional context, this is my "Option" model setup code.
define([
  'backbone'
], function (Backbone) {
  var Option = Backbone.Model.extend({
    idAttribute: "_id",
    defaults: {
      name: '',
      location: '',
      link: '',
      voteCount: 0,
      expiredDate: Date.now(),
      imageName: ''
    },
    url: '/api/options/',
    readFile: function (file) {
      var reader = new FileReader();
      // Closure to capture the file information.
      reader.onload = (function (theFile, that) {
        return function (e) {
          that.set({ filename: theFile.name, data: e.target.result });
          that.set({ imageUrl: theFile.name });
          console.log(e.target.result);
        };
      })(file, this);
      // Read in the image file as a data URL.
      reader.readAsDataURL(file);
    }
  });
  return Option;
});
Problem found
My bad. In the "Option" model setup, it should be "urlRoot" instead of "url".
In your model you should use urlRoot instead of url:
urlRoot: '/api/options/'
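The difference matters because Backbone only appends the model's id when the base comes from urlRoot (or the collection); a literal url string is used verbatim. A rough sketch of the URL construction for a saved, non-new model (not Backbone's exact source):

```javascript
// How the request URL is derived from urlRoot plus the model id.
function modelUrl(urlRoot, id) {
  var base = urlRoot.replace(/\/$/, ''); // drop any trailing slash
  return id == null ? base : base + '/' + encodeURIComponent(id);
}

modelUrl('/api/options/', 'knCMZPK8RrY5Y7GQo'); // '/api/options/knCMZPK8RrY5Y7GQo'
```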

javascript file cordova.. how to pass value through variable

I am developing an Intel Cordova app.
To download files from the server, I have included the Cordova file download plugin, but the values it needs are hard-coded and I want to pass them through variables.
Here is my code:
var app = {
  fileName: "PointerEventsCordovaPlugin.wmv", // <-- pass this value through a variable (dynamic)
  uriString: "http://media.ch9.ms/ch9/8c03/f4fe2512-59e5-4a07-bded-124b06ac8c03/PointerEventsCordovaPlugin.wmv", // <-- this one also
  // Application Constructor
  initialize: function () {
    this.bindEvents();
  },
  ....
I have added fileName and uriString, but I want to set those values dynamically from variables. How can I do this?
cordova plugin link
Please reply if you know anything about this.
Following the example from the link you provided, remove the fileName and uriString fields from the app object and parameterize the functions that need them. For example, startDownload becomes:
startDownload: function (fileName, uriString) {
  window.requestFileSystem(LocalFileSystem.PERSISTENT, 0, function (fileSystem) {
    fileSystem.root.getFile(fileName, { create: true }, function (newFile) {
      app.downloadFile(uriString, newFile);
    });
  });
},
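A minimal sketch of the resulting call site. The app internals are stubbed here (in the real plugin code startDownload calls window.requestFileSystem as shown above); the point is just that the values now come from ordinary variables:

```javascript
// Stubbed app object: startDownload takes the values as parameters
// instead of reading hard-coded fields.
var app = {
  startDownload: function (fileName, uriString) {
    // Real code: window.requestFileSystem(...) then app.downloadFile(uriString, newFile)
    return 'downloading ' + fileName + ' from ' + uriString;
  }
};

// The values can now come from user input, a server response, etc.
var name = 'PointerEventsCordovaPlugin.wmv';
var uri = 'http://media.ch9.ms/ch9/8c03/f4fe2512-59e5-4a07-bded-124b06ac8c03/PointerEventsCordovaPlugin.wmv';
var msg = app.startDownload(name, uri);
```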

Plupload: perform custom check before starting upload

I have an MVC 5 view with a form and a plupload file-uploader section. The upload is triggered by a button on the form. I have no problem uploading file chunks to the server, setting the parameters on the query string and so on, but what I do have a problem with is starting the upload only after a custom sanity check has passed.
Here's what I have tried:
var uploader = new plupload.Uploader({
  runtimes: 'html5',
  drop_element: 'upload',
  browse_button: 'browse',
  url: "../UploadFile",
  chunk_size: "1024kb",
  multipart_params: { "uid": "uid", "chunk": "chunk", "chunks": "chunks", "name": "name" },
  init: {
    PostInit: function (file) {
      document.getElementById("filelist").innerHTML = "";
      document.getElementById('submit-all').onclick = function () {
        document.getElementById("infoPopup").style.visibility = "visible";
        document.getElementById('submit-all').enabled = false;
        var uuid = Math.uuidFast();
        document.getElementById("uid").value = uuid;
        uploader.settings.multipart_params = { uid: uuid, chunk: file.chunk, chunks: file.chunks, name: file.name };
        if (checkReq) {
          uploader.start();
        }
        return false;
      };
    },
The crucial part here is this:
if(checkReq){
uploader.start();
}
"checkReq" is my custom sanity-check script that verifies that the form values are not nonsensical (e.g. single form entries might be perfectly valid while their combination is simply wrong, etc.).
So the above does not prevent the upload; the check script is not even fired, and the Firebug console output shows no error.
Since googling tells me that there is also a "BeforeUpload" event, I tried this:
BeforeUpload: function (up, file) {
  if (checkReq) {
    up.stop();
    return false;
  }
  return true;
},
Which also does not seem to fire at all.
Edit: As a next attempt, I put the call to my checkReq function into BeforeUpload in "preinit", which should fire before any chunking etc. is done, i.e. before the upload is prepared. This also failed, although I have no idea why it does not fire:
var uploader = new plupload.Uploader({
  runtimes: 'html5',
  drop_element: 'upload',
  browse_button: 'browse',
  url: "../UploadFile",
  chunk_size: "1024kb",
  multipart_params: { "uid": "uid", "chunk": "chunk", "chunks": "chunks", "name": "name" },
  preinit: {
    BeforeUpload: function (up) {
      if (checkReq) {
        uploader.stop();
        uploader.splice(0, uploader.files.length);
        return false;
      }
      return true;
    }
  },
  init: {
    PostInit: function (file) {
      ...
I had used "dropzone.js" before, and my script worked fine with it, but I found that I needed chunked uploads, so I had to move to plupload, and now my script is being ignored.
Could someone please tell me where I am being stupid here? Thanks!
Got it solved.
It's a nasty, ugly hack, but it works:
Made the "actual" submit/upload button hidden.
Made a second button that acts as a pre-submit button with an onclick function.
The onclick function calls checkReq and, if that returns true, calls the click() function of the "actual" submit/upload button.
Like I said: nasty, but it works.
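The workaround's control flow can be sketched as a small factory (the element ids in the comments are assumptions; checkReq is the asker's validation function):

```javascript
// Returns an onclick handler for the visible pre-submit button: run the
// sanity check first, and only on success click the hidden real button.
function makePreSubmitHandler(checkReq, clickRealButton) {
  return function () {
    if (checkReq()) {
      clickRealButton(); // e.g. document.getElementById('submit-all').click()
    }
    return false; // suppress the default submit either way
  };
}

// Wiring sketch:
// document.getElementById('pre-submit').onclick =
//   makePreSubmitHandler(checkReq, function () {
//     document.getElementById('submit-all').click();
//   });
```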
