Getting a file URL from Sanity out of a blocks array - javascript

I'm building a website using Next.js and Sanity for the CMS. Sanity has built-in schemas for images but not for video, so a video needs to be uploaded with the File schema. The docs suggest that to get a file URL for use on the front-end you should use the query language GROQ to make this conversion in the request, like so:
// GROQ query
*[_type == 'movie'] {
  title,
  "manuscriptURL": manuscript.asset->url
}
But since I am using the File schema to embed short auto-looping videos into rich text content using the Blocks schema, I don't have the luxury of converting URLs in the request and need to do it dynamically as the blocks array data is being parsed for the @portabletext/react component.
Basically, what I get back for the file is simply an asset reference with the following data:
{
  "_type": "file",
  "asset": {
    "_ref": "file-e4e61f3b231cca8e3339e96e050aee428009c777-gif",
    "_type": "reference"
  }
}
When I then use Sanity's own @sanity/asset-utils package to get a file URL using their buildFileUrl() function, I get a URL with undefined values for that asset, where PROJECT_ID and DATASET are the correct values:
https://cdn.sanity.io/files/[PROJECT_ID]/[DATASET]/undefined.undefined
Here is the function I made, using their package's file URL function, to get the asset URL, which returns the URL above with the undefined values:
export function getSanityFileUrl(sanityFile) {
  const fileUrl = buildFileUrl(sanityFile.asset, {projectId: sanityConfig.projectId, dataset: sanityConfig.dataset})
  console.log(fileUrl) // logs the undefined.undefined URL shown above
  return fileUrl
}
Thanks and anything helps!

I found the solution. The buildFileUrl() function exported by @sanity/asset-utils expects a different asset object. Instead, a user in this situation should use the getFileAsset() function, which accepts a reference to the file.
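For reference, a minimal sketch of how that helper might look with getFileAsset(), assuming the same sanityConfig values as above and that the resolved asset exposes a url property:

import { getFileAsset } from '@sanity/asset-utils'

// Minimal sketch: resolve the embedded file reference to its CDN URL.
export function getSanityFileUrl(sanityFile) {
  // getFileAsset accepts the {_type: 'file', asset: {_ref}} object directly
  const asset = getFileAsset(sanityFile, {
    projectId: sanityConfig.projectId,
    dataset: sanityConfig.dataset,
  })
  return asset.url
}

The returned URL can then be handed to a custom file/video component registered with @portabletext/react while the blocks array is rendered.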

Related

Is there a way to Post an array to web api or mvc controller and get a file back to download as a result?

I use an HTML table whose content can be changed with mouse drag and drop. Technically, you can move the data from any table cell to another. The table is 50 rows by 10 columns, with each cell given a unique identifier. I want to export it to .xlsx format with the C# EPPlus library and give the exported file back to the client.
So I need to pass the whole table data upon a button press and post it to either a Web API or an MVC controller, create an Excel file (mirroring the original HTML table data) and send it back to be downloaded by the browser.
So the idea is to create an array which contains each table cell's value (of course there will be empty cells in that array), and post that array to the controller.
The problem with that approach lies in the download: if I call the Web API or MVC controller with a regular jQuery ajax POST, the browser does not recognize the response as a file.
C# code after ajax post:
[HttpPost]
public IHttpActionResult PostSavedReportExcel([FromBody]List<SavedReports> savedReports, [FromUri] string dateid)
{
    //some excel creation code
    HttpResponseMessage response = new HttpResponseMessage(HttpStatusCode.OK)
    {
        Content = new StreamContent(new MemoryStream(package.GetAsByteArray()))
    };
    response.Content.Headers.ContentType = new System.Net.Http.Headers.MediaTypeHeaderValue("application/vnd.openxmlformats-officedocument.spreadsheetml.sheet");
    response.Content.Headers.ContentDisposition = new System.Net.Http.Headers.ContentDispositionHeaderValue("attachment")
    {
        FileName = dateid + "_report.xlsx"
    };
    ResponseMessageResult responseMessageResult = ResponseMessage(response);
    return responseMessageResult;
}
Usually, for this kind of result I could use window.location = myurltocontroller to download properly, but that only works for GET requests; POSTing anything that way is not possible.
I found some answers which could help me in this topic:
JavaScript post request like a form submit
This points out I should go with creating a form which passes the values, but I do not know how to do so in the case of arrays (the table consists of 50 × 10 = 500 values which I have to pass in the form).
I tried some frontend-only solutions to the HTML-to-Excel export problem, which of course do not require building files on the API side, but the free jQuery add-ins are deprecated, not customizable, handle only the .xls format, etc.
I found the EPPlus NuGet package to be a highly customizable tool, which is why I want to try this approach in the first place.
So the question is: how can I post an array of 500 elements, that the controller will recognize, generate the file, and make it automatically download from browser?
If you can provide some code that would be fantastic, but giving me the right direction is also helpful.
Thank you.
You can use fetch() (docs) to send the request from the JS frontend. When the browser (JS) has received the response, it can then offer its binary content as a download. Something like this:
fetch("http://your-api/convert-to-excel", // Send the POST request to the Backend
{
method:"POST",
body: JSON.stringify(
[[1,2],[3,4]] // Here you can put your matrix
)
})
.then(response => response.blob())
.then(blob => {
// Put the response BLOB into a virtual download from JS
if (navigator.appVersion.toString().indexOf('.NET') > 0) {
window.navigator.msSaveBlob(blob, "my-excel-export.xlsx");
} else {
var a = window.document.createElement('a');
a.href = URL.createObjectURL(blob);
a.download = "my-excel-export.xlsx";
a.click();
}});
So the JS part in the browser actually first downloads the file behind the scenes, and only when that is done does it trigger the "download" from the browser's memory into a file on disk.
This is quite a common scenario with REST APIs that require bearer token authentication.
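As for gathering the 500 cell values into the matrix that the fetch body above posts, a rough sketch (the table id reportTable is an assumption, use your own selector):

// Collect the 50x10 table into an array of row arrays before calling fetch()
var matrix = [];
document.querySelectorAll("#reportTable tr").forEach(function (row) {
  var cells = [];
  row.querySelectorAll("td").forEach(function (cell) {
    cells.push(cell.innerText); // empty cells simply become empty strings
  });
  matrix.push(cells);
});
// matrix can now replace the [[1,2],[3,4]] placeholder in JSON.stringify(...)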

Piping data from a file to a rendered page in Sails.js

My application needs to read in a large dataset and pipe it to the client to manipulate with D3.js. The problem is, on large datasets, the reading/loading of the file contents could take a while. I want to solve this using streams. However, I'm unsure of how to do so in the context of the Sails framework.
What I want to do is read the contents of the file and pipe it to a rendered page. However, I can't figure out how to pipe it through if I use something like res.view('somePage', { data: thePipedData });.
I currently have something like this:
var datastream = fs.createReadStream(path.resolve(DATASET_EXTRACT_PATH, datatype, dataset, dataset + '.csv'));
datastream.pipe(res);
...
return res.view('analytics', { title: 'Analytics', data: ??? });
What's the best way to approach this?
Based on your example it seems like the best course of action would be to set up a separate endpoint to serve just the data, and include it on the client via a regular <script> tag.
MyDataController.js
getData: function(req, res) {
  /* Some code here to determine datatype and dataset based on params */
  // Wrap the data in a Javascript string
  res.write("var theData = '");
  // Open a read stream for the file
  var datastream = fs.createReadStream(
    path.resolve(DATASET_EXTRACT_PATH, datatype, dataset, dataset + '.csv')
  );
  // Pipe the file to the response. Set {end: false} so that res isn't closed
  // when the file stream ends, allowing us to continue writing to it.
  datastream.pipe(res, {end: false});
  // When the file is done streaming, finish the Javascript string
  datastream.on('end', function() {
    res.end("';");
  });
}
MyView.ejs
<script language="javascript" src="/mydata/getdata?datatype=<%=datatype%>&etc.."></script>
MyViewController.js
res.view('analytics', {datatype: 'someDataType', etc...});
A slight variation on this strategy would be to use a JSONP-style approach; rather than wrapping the data in a variable in the data controller action, you would wrap it in a function. You could then call the endpoint via AJAX to get the data. Either way you'd have the benefit of a quick page load since the large data set is loaded separately, but with the JSONP variation you'd also be able to easily show a loading indicator while waiting for the data.
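A rough sketch of that JSONP-style variation, assuming the client defines a global loadData(csv) callback and the same path constants as in getData above (escaping of quotes inside the CSV is ignored here, as in the original):

// MyDataController.js (JSONP-style variation, sketch only)
getDataJsonp: function(req, res) {
  var datatype = req.param('datatype');
  var dataset = req.param('dataset');
  // Wrap the CSV in a call to a client-defined callback instead of a variable
  res.write("loadData('");
  var datastream = fs.createReadStream(
    path.resolve(DATASET_EXTRACT_PATH, datatype, dataset, dataset + '.csv')
  );
  datastream.pipe(res, {end: false});
  datastream.on('end', function() {
    // The client's loadData() can render the chart and hide the loading indicator
    res.end("');");
  });
}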

Additional parameters when using intel.xdk.file.uploadToServer?

I'm using Intel XDK to create a smartphone application. Currently I'm uploading a captured photograph using intel.xdk.file.uploadToServer, as shown in their documentation. This is working; however, I would like to send additional parameters to the back-end (PHP) beyond those required by the 'uploadToServer' function.
What should I do / use?
The uploadToServer file API does not allow you to specify any parameters beyond what is documented.
I would use the Parse JavaScript APIs, which allow you to easily save an object and link it to an uploaded file. Here is an example:
Parse.initialize("YOUR KEY GOES HERE"); //API key
//whatever you want to call your storage object
var PhotoDetails = Parse.Object.extend("PhotoDetails");
//create new instance of your Parse Object
var photoDetails = new PhotoDetails();
//you can add each param separately and save
photoDetails.set("paramname", "value");
photoDetails.save();
//or use object literal notation
photoDetails.save({
  category: "landscape",
  description: "a very cool photo",
  location: "33.38453, -28.234234"
}).then(function(object) {
  alert("Photo Recorded!");
});
You can also store the actual photo or other file in the cloud, up to 10MB per file. Parse determines the file type by the file extension, or you can specify the type in the optional third param below:
//see https://parse.com/docs/js_guide#files
//for base 64 or HTML file input examples
var parseFile = new Parse.File("myphoto.jpg", fileData, "image/jpg");
parseFile.save().then(function() {
  alert("The file has been saved to Parse.");
}, function(error) {
  console.log("The file either could not be read, or could not be saved to Parse.");
});
You can associate a Parse File with a Parse Object by using:
photoDetails.set("photoFile", parseFile);
photoDetails.save();
Then in the cloud you can log in to Parse and you will see your object type in the Data Browser view, with your photo image and all the other params you specified.
For more info see: https://parse.com/docs/js_guide#javascript_guide

Play Framework: How to implement REST API for File Upload

I'm developing a REST API with Play 2 and I'm wondering how to implement file upload functionality.
I've read the official Play documentation but it just provides a multipart/form-data example, while my backend does not provide any form... it just consists of a REST API to be invoked by a JavaScript client or whatever else.
That said, what's the correct way to implement such an API? Should I implement a PartHandler and then still use the multipartFormData parser? How should I pass the file content to the API? Is there any exhaustive example on this topic?
Any help would be really appreciated.
You should look into BodyParsers: http://www.playframework.com/documentation/2.2.x/ScalaBodyParsers
What you are trying to do is not especially complicated, especially if you are only handling smaller files that fit in memory. After all, uploading a file is just sending the file as the body of a POST or something like that; it is not any different from receiving some XML or JSON in a request.
Hope this helps
import org.apache.http.entity.mime._
import java.io.File
import org.apache.http.entity.mime.content._
import java.io.ByteArrayOutputStream
import play.api.libs.ws.WS

val contents = "contents string"
val file = File.createTempFile("sample", ".txt")
val bw = new java.io.BufferedWriter(new java.io.FileWriter(file))
bw.write(contents)
bw.close()

// Build the multipart body with Apache HttpClient's mime classes
val builder = MultipartEntityBuilder.create()
builder.addPart("file", new FileBody(file, org.apache.http.entity.ContentType.create("text/plain"), "sample"))
builder.setMode(HttpMultipartMode.BROWSER_COMPATIBLE)

val entity = builder.build
val outputstream = new ByteArrayOutputStream
entity.writeTo(outputstream)
val header = (entity.getContentType.getName -> entity.getContentType.getValue)
val response = WS.url("/post/file").withHeaders(header).post(outputstream.toByteArray())
To pass your contents, depending on your client side, you can encode the contents to Base64 at the client side and pass them as JSON (you can use the JSON body parser). Then on the server side you can decode the contents using a Base64 decoder (e.g. from Apache Commons) to get the byte array. It will be as simple as
Base64.decodeBase64(YourEncodedFileContent)
When you have the byte array you can simply write it to disk, save it to a database, etc. We are using this approach in production and it works fine; however, we only handle small file uploads.
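For the client side of that Base64/JSON approach, a rough sketch (the /api/files endpoint and the payload field names are placeholders, not part of the answer above):

// Read a File picked via <input type="file">, Base64-encode it, POST it as JSON
function uploadAsJson(file) {
  var reader = new FileReader();
  reader.onload = function () {
    // reader.result looks like "data:<mime>;base64,<payload>"; keep the payload
    var base64 = reader.result.split(',')[1];
    fetch('/api/files', {
      method: 'POST',
      headers: { 'Content-Type': 'application/json' },
      body: JSON.stringify({ name: file.name, content: base64 })
    });
  };
  reader.readAsDataURL(file);
}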
OK, thank you all for your suggestions... here below is how I solved my issue:
object Files extends Controller {

  def upload = SecuredAction[Files.type]("upload").async(parse.multipartFormData(partHandler)) { implicit request =>
    future {
      request.body.files.head.ref match {
        case Some((data, fileName, contentType)) => Ok(success(Json.obj("fileName" -> fileName)))
        case _ => BadRequest
      }
    }.recover { case e =>
      InternalServerError(error(errorProcessingRequest(e.getMessage)))
    }
  }

  ...

  private def partHandler = {
    parse.Multipart.handleFilePart {
      case parse.Multipart.FileInfo(partName, fileName, contentType) =>
        Iteratee.fold[Array[Byte], ByteArrayOutputStream](
          new ByteArrayOutputStream
        ) { (outputStream, data) =>
          outputStream.write(data)
          outputStream
        }.map { outputStream =>
          outputStream.close()
          Some(outputStream.toByteArray, fileName, contentType.get)
        }
    }
  }
}
I hope it helps.
while my backend does not provide any form... it just consists of a REST API to be invoked by a JavaScript client
Then your backend is not a REST API. You should follow the HATEOAS principle, so you should respond with links and forms along with the data to every GET request. You don't have to send back HTML; you can describe these things with hypermedia JSON or XML media types, for example with JSON-LD, HAL+JSON, ATOM+XML, etc. So you have to describe your upload form in your preferred hypermedia and let the REST client turn that description into a real HTML file upload form (if the client is HTML). After that you can send multipart/form-data as usual (REST is media-type agnostic, so you can send data in any media type you want, not just in a JSON format). Check the AJAX file upload techniques for further detail...
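For illustration, a rough sketch of the kind of hypermedia description that answer has in mind; the shape loosely follows HAL-FORMS and the property names are illustrative, not taken from any one spec:

// A files resource that tells the client how to perform the upload
var filesResource = {
  "_links": {
    "self": { "href": "/api/files" }
  },
  "_templates": {
    "upload": {
      "method": "POST",
      "contentType": "multipart/form-data",
      "properties": [
        { "name": "file", "required": true }
      ]
    }
  }
};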

phonegap read and write json file

I am looking to store a JSON file locally on iOS/Android in a PhoneGap (Cordova) application.
Basically, I retrieve a JSON file from the server ($.getJSON) and I want to first store the JSON file and then retrieve and modify it.
I've looked at FileWriter but I don't see a mime type option... the only example covers text files.
Thanks in advance!
Nick
Nick, just use FileWriter.write to save your JSON data to disk. JSON is text-based anyway, so there is no need to set the mime type. When you are ready to load the file again, use FileReader.readAsText. The "onloadend" handler of the FileReader will be called with an event, and event.target.result will be your JSON data. Then you'll do a
var myJson = JSON.parse(event.target.result);
to turn the text into a JSON object.
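A rough sketch of that write/read round trip, assuming the Cordova file plugin has already handed you a FileEntry (obtained via requestFileSystem/getFile, which is omitted here):

// Save and load JSON with the FileWriter/FileReader flow described above
function saveJson(fileEntry, data, done) {
  fileEntry.createWriter(function (writer) {
    writer.onwriteend = done;
    writer.write(JSON.stringify(data)); // JSON is just text, no mime type needed
  });
}

function loadJson(fileEntry, callback) {
  fileEntry.file(function (file) {
    var reader = new FileReader();
    reader.onloadend = function (event) {
      callback(JSON.parse(event.target.result)); // back to a JSON object
    };
    reader.readAsText(file);
  });
}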
(For templates or default settings) I just store them in separate constant files so I don't have to use any special file utility add-ons or commands, super easy:
(function() {
  angular.module('app').constant('settings_json',
    {
      "name": "settings",
      "data": [
        {
          "set": "globals",
          "share_diagnostics": true,
          "user_prefs_save_cloud": true
        },
        {
          "set": "user",
          "fname": '',
          "lname": '',
          "telephone": '',
          "email": ''
        }
      ]
    }
  );
})();
Then in your app: (after injecting 'settings_json')
var settings = angular.fromJson(settings_json);
