I'm trying to convert a Parse.com object into an image with JavaScript. The object is of type Bytes, and I can't convert it in any way. Can someone give me an example of how to turn it into a functional URL? Whenever I try to access it, it just crashes.
Edit:
getURL: function () {
    var query = new Parse.Query("Token");
    query.limit(20);
    query.find().then(this.handleCallback.bind(this));
},

handleCallback: function (objects) {
    for (var i = 0; i < objects.length; i++) {
        this.tokenSearch[i] = {
            imgURL: "data:image/png;base64," + objects[i].get("pic")
        };
    }
}
I have tried objects[i].get("pic").url(), objects[i].get("pic").toString('base64') and some other stuff, but it won't work!
If you want to store images and later have a URL you can load, you should switch to the File data type instead of pushing just a byte array. The documentation you linked recommends the following:
There are two ways to store binary data. The Bytes type allows you to associate NSData/bytes[] types directly on a PFObject. This is recommended only for small pieces of binary-encoded data. For actual files (images, documents, etc.), the File type can be used by instantiating a PFFile/ParseFile and setting it on a field.
You have stored the actual byte array into the pic column of your Parse class. Therefore, that's all you get. If you chose instead to store it as a File (PFFile in iOS/OSX, ParseFile in Android), you could do what you're trying to do. As it is, Parse is merely storing the bytes as it would store any other data, such as a Date or a String. There's no URL data associated with it.
Here's how you can create and upload a PFFile from an iOS device:
- (void)uploadImage:(UIImage *)image {
    PFFile *file = [PFFile fileWithData:UIImagePNGRepresentation(image)];
    PFObject *newToken = [PFObject objectWithClassName:@"Token"];
    newToken[@"pic"] = file;
    [newToken saveInBackgroundWithBlock:^(BOOL succeeded, NSError *error) {
        // Do real error handling here
        // Now you have:
        // 1. The "pic" saved to Parse as a PFFile
        // 2. The "token" saved to Parse as a PFObject
        // 3. The "pic" stored as a property on the "token"
    }];
}
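If the upload happens from the browser instead of iOS, the JavaScript SDK can do the same thing with Parse.File. A rough sketch, assuming you already have a base64-encoded PNG string in a variable called base64Data (the "pic.png" name is just a placeholder):

// Sketch: upload the image as a Parse.File from the JavaScript SDK
var file = new Parse.File("pic.png", { base64: base64Data });
file.save().then(function () {
    var Token = Parse.Object.extend("Token");
    var newToken = new Token();
    newToken.set("pic", file); // store the file on the "pic" column
    return newToken.save();
}).then(function (token) {
    // token.get("pic").url() now returns a loadable URL
}, function (error) {
    // handle the error
});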
Once you have it stored that way on Parse, you would access it in the JavaScript SDK something like this:
function getURL() {
    var query = new Parse.Query("Token");
    query.limit(20);
    query.find().then(function (objects) {
        for (var i = 0; i < objects.length; i = i + 1) {
            this.tokenSearch[i] = { imgURL: objects[i].get("pic").url() };
        }
    }.bind(this)); // bind so that this.tokenSearch refers to your object, as in your original code
}
Related
I'm writing a function to export data pulled from C# to a CSV file in JavaScript. The data is being passed into an active web page with live data. The goal is to get that data from a C# string to the client's window, hence the hand-off to JavaScript (if my understanding is right). The JavaScript function in the page then downloads it to the user's window.
My issue is: why am I getting the "invalid export data, please provide an array of objects" error when I think I am providing exactly that?
C# Code:
protected string test()
{
    MyRequest request = new MyRequest();
    request.Id = "TEST";
    // Can't post this class, but this is what I do
    var tmp = new ExistingClassOfMine();
    // pulls the data from the existing class through GetResponse();
    // then converts it to a JSON string
    string data = JsonConvert.SerializeObject(tmp.GetResponse(request));
    // Data found from breakpoint: {"Slot1":[{"Name":"TJ", "ID":"123"},{"Name":"Joe","ID:456"}], "TotalCount":2}
    return data;
}
JavaScript:
function exportToExcel() {
    console.log("Exporting to Excel");
    isExporting = true;
    const fileName = 'NameList';
    const exportType = 'csv';
    var data = <%=this.test()%>;
    var readData = data["Slot1"];
    var myArray = [];
    // Followed a stack article to create this
    for (var i in readData) {
        myArray.push(readData[i]);
    }
    // Logging stuff for debugging
    console.log("Data from C#");
    console.log(data);
    console.log(typeof (data));
    console.log("Data we want to export");
    console.log(readData);
    console.log("Parsed Data?");
    console.log(myArray);
    // fails here with the error: Invalid export data, please provide an array of objects
    // which I thought I did with my for loop up a few lines
    window.exportFromJSON({myArray, fileName, exportType});
    isExporting = false;
}
function onRequestStart(sender, args) {
    if (isExporting) {
        args.set_enableAjax(false);
    }
}
You're trying to change the name of the key that is sent to exportFromJSON. The key has to be named data, not myArray.
Either rename myArray to data, or assign myArray to the data key explicitly:
window.exportFromJSON({data: myArray, fileName, exportType});
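For context, the ES2015 shorthand property syntax uses the variable name as the key, which is why your original call produced a key named myArray that exportFromJSON doesn't look for. A small illustration (the sample values are made up):

const myArray = [{ Name: "TJ", ID: "123" }];
const fileName = 'NameList';
const exportType = 'csv';

// Shorthand uses the variable names as the keys:
console.log({ myArray, fileName, exportType }); // { myArray: [...], fileName: "NameList", exportType: "csv" }

// exportFromJSON looks for the array under a key literally named "data":
console.log({ data: myArray, fileName, exportType }); // { data: [...], fileName: "NameList", exportType: "csv" }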
This is the code I'm using to save my jSignature value to my database:
var datapair = $('.sign-wrapper').jSignature();
datapair.bind('change', function (e) {
    var data = datapair.jSignature("getData", "svgbase64");
    var i = new Image();
    i.src = "data:" + data[0] + "," + data[1];
    $(this).prev('input').val(i.src);
});
So I have the following value in my DB:
data:image/svg+xml;base64,PD94bWwgdmVyc2lvbj0iMS4wIiBlbmNvZGluZz0iVVRGLTgiIHN0YW5kYWxvbmU9Im5vIj8+PCFET0NUWVBFIHN2ZyBQVUJMSUMgIi0vL1czQy8vRFREIFNWRyAxLjEvL0VOIiAiaHR0cDovL3d3dy53My5vcmcvR3JhcGhpY3MvU1ZHLzEuMS9EVEQvc3ZnMTEuZHRkIj48c3ZnIHhtbG5zPSJodHRwOi8vd3d3LnczLm9yZy8yMDAwL3N2ZyIgdmVyc2lvbj0iMS4xIiB3aWR0aD0iODkiIGhlaWdodD0iMTEyIj48cGF0aCBmaWxsPSJub25lIiBzdHJva2U9IiMwMDAwMDAiIHN0cm9rZS13aWR0aD0iMiIgc3Ryb2tlLWxpbmVjYXA9InJvdW5kIiBzdHJva2UtbGluZWpvaW49InJvdW5kIiBkPSJNIDEgMTExIGMgMC4wNCAtMC4xNCAxLjAyIC01LjQ1IDIgLTggYyAyLjM0IC02LjA4IDUuOSAtMTEuODcgOCAtMTggYyAxLjg2IC01LjQ0IDMuMDkgLTExLjIyIDQgLTE3IGMgMS4xIC02Ljk4IDAuODggLTEzLjk4IDIgLTIxIGMgMS4yNSAtNy44MyAzLjA0IC0xNS4zOSA1IC0yMyBjIDAuNzEgLTIuNzYgMS43NiAtNS41MiAzIC04IGMgMS4wNSAtMi4wOSAyLjUyIC00LjA4IDQgLTYgYyAxLjg5IC0yLjQ2IDMuOTIgLTQuOTIgNiAtNyBjIDAuODEgLTAuODEgMi4yNiAtMi4yMiAzIC0yIGMgMS43NyAwLjUzIDUuMzcgMyA3IDUgYyAyLjM3IDIuOTEgNC40NiA3LjEzIDYgMTEgYyA1LjQ2IDEzLjc5IDkuMzIgMjguNTIgMTUgNDIgYyAyLjIzIDUuMjkgNi4zOCA5Ljc2IDkgMTUgYyAzLjQyIDYuODQgNS43NiAxNC43NiA5IDIxIGwgNCA0Ii8+PHBhdGggZmlsbD0ibm9uZSIgc3Ryb2tlPSIjMDAwMDAwIiBzdHJva2Utd2lkdGg9IjIiIHN0cm9rZS1saW5lY2FwPSJyb3VuZCIgc3Ryb2tlLWxpbmVqb2luPSJyb3VuZCIgZD0iTSAyMCA2NCBjIDAuMTEgLTAuMTIgMy43MyAtNS40IDYgLTcgYyAyLjk4IC0yLjEgNy4yMSAtMy44NSAxMSAtNSBjIDcuMDcgLTIuMTQgMTQuNzggLTMuNzEgMjIgLTUgYyAxLjkyIC0wLjM0IDYgMCA2IDAiLz48L3N2Zz4=
So when the page reloads, I have the above base64 image data URL in a hidden input.
I tried to load the data into the existing signature canvas using the code below:
// If a value exists, load it
var src = $(this).prev('input').val();
if (src != '') {
    datapair.jSignature("importData", src);
}
But it is not working, and I am getting the following error in the console:
Uncaught Error: jSignature is unable to find import plugin with for
format 'image/svg+xml;base64'
The problem occurs when your data is not recognized by the plugin.
The reason it throws an error is that the format 'image/svg+xml;base64' is not supported by the importer; there is only exporter support for it.
Reference:
https://github.com/brinley/jSignature#data-import--export-and-plugins
I suggest you store the data using base30, because it has support for both export and import.
var data = datapair.jSignature("getData", "base30");
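As a rough round-trip sketch based on the jSignature README (using the same .sign-wrapper element and hidden input from your code; where exactly you store the string is up to you):

// Export as base30 — getData returns a [mimeType, data] pair for non-default formats
var datapair = $('.sign-wrapper').jSignature("getData", "base30");
// Join it into a data URL so the format is self-describing when loading
var stored = "data:" + datapair.join(",");
$('.sign-wrapper').prev('input').val(stored);

// Later, restore it — setData infers the format from the data-url prefix
$('.sign-wrapper').jSignature("setData", stored);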
setData takes two arguments - the data object and the data format name. When the data object is a string formatted in the data-url pattern, you don't need to specify the data format name; the format name (mime) will be implied from the data-url prefix. See the example below for that. It returns (in the traditional jQuery chainable way) a jQuery object ref to the element onto which the plugin was applied.
You can store the value in base64 and load it. I load the base64 value into an input and then, through jQuery, load it into the object.
// Create the signature object.
$("#jq_sig").jSignature({
    // width/height of signature pad
    width: 497,
    height: 200,
    // Bootstrap 4 formatting
    cssclass: "bg-light border",
    lineWidth: 2,
    color: "black",
});
// I take the value of the form field that I previously loaded with PHP.
var firma = $('#form input[name=firma]').val();
if (firma.length > 0)
{
    $("#jq_sig").jSignature("setData", 'data:image/png;base64,' + firma);
    console.log('Signature loaded.');
}
else
{
    console.log('Signature not loaded.');
}
You can save it to your database in any format; you then tell setData what kind of format it is.
I'm working on a CSV parsing web application, which collects data and then uses it to draw a plot graph. So far it works nicely, but unfortunately it takes some time to parse the CSV files with papaparse, even though they are only about 3MB.
So it would be nice to have some kind of progress shown, when "papa" is working. I could go for the cheap hidden div, showing "I'm working", but would prefer the use of <progress>.
Unfortunately the bar only gets updated AFTER Papa has finished its work. So I tried to get into web workers and use a worker file to calculate progress, and also set worker: true in Papa Parse's configuration. Still to no avail.
The configuration used (with a step function) is as follows:
var papaConfig = {
    header: true,
    dynamicTyping: true,
    worker: true,
    step: function (row) {
        if (gotHeaders == false) {
            for (k in row.data[0]) {
                if (k != "Time" && k != "Date" && k != " Time" && k != " ") {
                    header.push(k);
                    var obj = {};
                    obj.label = k;
                    obj.data = [];
                    flotData.push(obj);
                    gotHeaders = true;
                }
            }
        }
        tempDate = row.data[0]["Date"];
        tempTime = row.data[0][" Time"];
        var tD = tempDate.split(".");
        var tT = tempTime.split(":");
        tT[0] = tT[0].replace(" ", "");
        dateTime = new Date(tD[2], tD[1] - 1, tD[0], tT[0], tT[1], tT[2]);
        var encoded = $.toJSON(row.data[0]);
        for (j = 0; j < header.length; j++) {
            var value = $.evalJSON(encoded)[header[j]];
            flotData[j].data.push([dateTime, value]);
        }
        w.postMessage({ state: row.meta.cursor, size: size });
    },
    complete: Done,
};
The worker setup on the main page:
var w = new Worker("js/workers.js");
w.onmessage = function (event) {
    $("#progBar").val(event.data);
};
and the called worker is:
onmessage = function (e) {
    var progress = e.data.state;
    var size = e.data.size;
    var newPercent = Math.round(progress / size * 100);
    postMessage(newPercent);
};
The progress bar is updated, but only after the CSV file is parsed and the page has been set up with the data; so the worker is called, but its answer is only handled after parsing. Papa Parse also seems to be called in a worker, or so it looks when checking the calls in the browser's debugging tools, but the page still stays unresponsive until all the data shows up.
Can anyone point me to what I have done wrong, or where to adjust the code, to get a working progress bar? I guess this would also deepen my understanding of web workers.
You could use the FileReader API to read the file as text, split the string by "\n" and then count the length of the returned array. This is then your size variable for the calculation of percentage.
You can then pass the file string to Papa (you do not need to reread directly from the file) and pass the number of rows (the size variable) to your worker. (I am unfamiliar with workers and so am unsure how you do this.)
Obviously this only works accurately if there are no embedded line breaks inside the CSV file (e.g. where a string is spread over several lines), as these will count as extra rows, so you will not make it to 100%. Not a fatal error, but it may look strange to the user if it always seems to finish before 100%.
Here is some sample code to give you ideas.
var size = 0;

function loadFile() {
    var files = document.getElementById("file").files; // load file from file input
    var file = files[0];
    var reader = new FileReader();
    reader.readAsText(file);
    reader.onload = function (event) {
        var csv = event.target.result; // the string version of your csv
        var csvArray = csv.split("\n");
        size = csvArray.length;
        console.log(size); // the number of rows in your file
        Papa.parse(csv, papaConfig); // send the csv string to Papa for parsing
    };
}
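One way to wire that row count into the progress bar (skipping the extra worker entirely) is to count parsed rows inside the step callback; a rough sketch, assuming a <progress id="progBar" max="100"> element and the papaConfig from the question:

// Sketch: size is the row count computed in loadFile() above
var parsedRows = 0;
papaConfig.step = function (row) {
    // ...the existing per-row processing from the question goes here...
    parsedRows++;
    $("#progBar").val(Math.round(parsedRows / size * 100));
};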
I haven't used Papa Parse with workers before, but a few things pop up after playing with it for a bit:
It does not seem to expect you to interact directly with the worker
It expects you to either want the entire final result, or the individual items
Using a web worker makes providing a JS Fiddle infeasible, but here's some HTML that demonstrates the second point:
<html>
<head>
    <script src="papaparse.js"></script>
</head>
<body>
    <div id="step"></div>
    <div id="result"></div>
    <script type="application/javascript">
        var papaConfig = {
            header: true,
            worker: true,
            step: function (row) {
                var stepDiv = document.getElementById('step');
                stepDiv.appendChild(document.createTextNode('Step received: ' + JSON.stringify(row)));
                stepDiv.appendChild(document.createElement('hr'));
            },
            complete: function (result) {
                var resultDiv = document.getElementById('result');
                resultDiv.appendChild(document.createElement('hr'));
                resultDiv.appendChild(document.createTextNode('Complete received: ' + JSON.stringify(result)));
                resultDiv.appendChild(document.createElement('hr'));
            }
        };

        var data = 'Column 1,Column 2,Column 3,Column 4 \n\
1-1,1-2,1-3,1-4 \n\
2-1,2-2,2-3,2-4 \n\
3-1,3-2,3-3,3-4 \n\
4,5,6,7';

        Papa.parse(data, papaConfig);
    </script>
</body>
</html>
If you run this locally, you'll see you get a line for each of the four rows of the CSV data, but the call to the complete callback gets undefined. Something like:
Step received: {"data":[{"Column 1":"1-1",...
Step received: {"data":[{"Column 1":"2-1",...
Step received: {"data":[{"Column 1":"3-1",...
Step received: {"data":[{"Column 1":"4","...
Complete received: undefined
However if you remove or comment out the step function, you will get a single line for all four results:
Complete received: {"data":[{"Column 1":"1-1",...
Note also that Papa Parse uses a streaming concept to support the step callback regardless of whether a worker is used or not. This means you won't know directly how many items you are parsing, so calculating the percent complete is not possible unless you can find the number of items separately.
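One way to get that number separately is to fall back to the input size in characters, which the step callback already exposes through row.meta.cursor (as used in the question); a rough sketch, assuming the same file input and progress bar as above:

// Sketch: cursor-based progress instead of row counting (approximate for non-ASCII files)
var file = document.getElementById("file").files[0];
papaConfig.step = function (row) {
    // ...per-row processing...
    $("#progBar").val(Math.round(row.meta.cursor / file.size * 100));
};
Papa.parse(file, papaConfig);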
I want to write an Angular module that saves data locally (with IndexedDB) and sync that data with server data via a RESTful service.
I have already built a rudimentary base for this that can send data to the server and put it into IndexedDB.
Because the web application has to run in offline mode too, I have chosen to use two keys for the object stores: a local ID, which is auto-incremented, and the ID of the entry on the server.
The server isn't aware of the local ID, and the local data entry might not know the server ID if it couldn't be transmitted to the server yet.
When I define a unique index for the server ID, the local entries won't get updated if the server ID already exists, and the update process stops.
Is there a way to do this directly with the IDB API?
I have found a similar problem, but with a simple cursor and cursor.update solution there is no way to insert new data from the server into the local DB:
Indexeddb - Update record by index key
I have found a way to do this. I handle the local data (with a local ID) separately from the server data (no local ID, because the server doesn't know it).
I update the local data with IDBObjectStore.put, and check with separate IDBIndex.openCursor calls whether the data that came from the server (server ID set, but no local ID) is already saved locally. If no local entry is found for the given serverId, a new entry is added with IDBObjectStore.put; if one is found, the cursor on that entry is updated with the server data (the old localId is carried over).
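For reference, a minimal sketch of the object store schema the code below assumes (the database name and the localId/serverId field names are illustrative; _storageName matches the variable used in the code below; the serverId index is deliberately non-unique so locally created entries without a server ID don't run into a uniqueness constraint):

var openRequest = indexedDB.open('appDb', 1);
openRequest.onupgradeneeded = function (e) {
    var db = e.target.result;
    // localId is the auto-incremented primary key
    var store = db.createObjectStore(_storageName, { keyPath: 'localId', autoIncrement: true });
    // serverId gets its own (non-unique) index so entries can be looked up by it
    store.createIndex('serverId', 'serverId', { unique: false });
};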
// var _db => connected database

// filter the data that came from the server (no local id defined, but server id is)
var serverData = [];
for (var i = 0; i < data.length; i++) {
    if (data[i].localId === undefined && data[i].serverId !== undefined && data[i].serverId !== null) {
        // remove the server data object and add it to the server data array
        serverData = serverData.concat(data.splice(i, 1));
        i--;
    }
}

var transaction = _db.transaction(_storageName, 'readwrite');
transaction.oncomplete = function () {
    // do something on completion
};
transaction.onerror = function (e) {
    // do something when an error occurs
};

var objectStore = transaction.objectStore(_storageName);

// Add local data to the database (no server id)
// local id can be existing or not (new entry)
for (var i = 0; i < data.length; i++) {
    objectStore.put(data[i]).onsuccess = function (e) {
        // do something when successfully added
    };
}

// Add data from the server to the database
var index = objectStore.index('serverId'); // server id index for searching

// go through all data from the server
for (var i = 0; i < serverData.length; i++) {
    (function () {
        var serverItem = serverData[i];
        // search for an existing entry in the local database
        var checkRequest = index.openCursor(IDBKeyRange.only(serverItem.serverId));
        checkRequest.onsuccess = function (e) {
            var cursor = e.target.result;
            // If the item was not found in the local IndexedDB storage...
            if (cursor === null) {
                // add a new item to the storage
                this.source.objectStore.put(serverItem).onsuccess = function (e) {
                    // do something when successfully added
                };
            // Item was found locally
            } else {
                var dbItem = cursor.value;
                // set the local id of the added item to the one from the old local entry
                serverItem.localId = dbItem.localId;
                // update the found local entry with the one from the server
                cursor.update(serverItem).onsuccess = function (e) {
                    // do something on success
                };
            }
        };
    })();
}
There could be a more elegant solution, but this is the first one I came up with that worked. I would be thankful for any improvements or better solutions.
I'm trying to retrieve the filenames of gists on GitHub from the data returned by GitHub's API. I'm using JavaScript to access the data.
An example result can be found here: https://api.github.com/users/blaercom/gists. I've also copied a shortened version of the data below.
{
    ...
    "id": "4468273",
    "files": {
        "firstpost.md": {
            "type": "text/plain",
            "filename": "firstpost.md",
            "size": 16,
            "language": "Markdown"
        }
    }
    ...
}
I've been trying many things, but I can't seem to access the filename. This is because the files object is not an array but a set of key-value pairs, where the key itself is the filename.
Things I've tried include
filename = files[0]['filename']
filename = files[0].filename
filename = files['filename']
Frankly, the only methods that work are variations of filename = files['firstpost.md']['filename'], but this isn't valid since I cannot know the filename beforehand.
I'm sure it is possible to access the filename, but have been spending quite a while testing different methods. Help would be appreciated!
You can use for (var key in object) {}; here is an example using the data from your API call:
var filenames = [];
for (var filename in data[0].files) {
    filenames.push(filename);
}
console.log(filenames); // ["firstpost.md"]
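Equivalently, since you only need the keys, Object.keys collects them in one call (same data variable as above):

var filenames = Object.keys(data[0].files);
console.log(filenames); // ["firstpost.md"]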
Here is a real example using your JSON response:
var obj = 'your json data';
var fileNames = [];
for (var i in obj[0]['files'])
{
    var fileName = obj[0]['files'][i]['filename'];
    fileNames.push(fileName);
}
document.write(fileNames[0]); // firstpost.md
Update:
Another example using jsonp/script.
<script src="https://api.github.com/users/blaercom/gists?callback=myCallback"></script>
The callback function
function myCallback(o)
{
    var obj = o.data;
    var fileNames = [];
    for (var i in obj[0]['files'])
    {
        var fileName = obj[0]['files'][i]['filename'];
        fileNames.push(fileName);
    }
    document.write(fileNames[0]); // firstpost.md
}