my .json file is not accessible from d3.json - javascript

Here is my script to create a line chart. The JSON file is not accessible from the d3.json() call. I am new to d3.js in general, and I am using dimple.js to plot charts with data picked up from the back end. I have also listed the content of my JSON file below.
var svg = dimple.newSvg("#charts", 800, 600);
d3.json("C:/dev/reports/data1.json", function (jsonfile) {
    // Create a new chart object based on this data and svg
    var myChart = new dimple.chart(svg, jsonfile);
    myChart.addCategoryAxis("x", "Word");
    myChart.addMeasureAxis("y", "Awesomeness");
    myChart.addSeries(null, dimple.plot.line);
    myChart.draw();
});
JSON content:
[
{ "Word":"Hello", "Awesomeness":2000 },
{ "Word":"World", "Awesomeness":3000 }
]

If you're using Chrome, it may prevent you from opening the file properly because of its cross-domain security restrictions on files loaded straight from the local filesystem. Try Firefox to see if that's the case (it will probably let you load the file correctly).
If that is the problem, you will want to install a local web server like WAMP (if you're running Windows) or follow the instructions on the wiki page here.
It may also help to use relative addressing to locate your data1.json file. If it's in the same directory as your HTML file, just use:
d3.json("data1.json", function (jsonfile) {

Related

How to get custom ttf font working with jsPDF.output()

I've added the jsPDF library to my Titanium project to generate PDFs client side, which has been working great. But now I want to localize the app for Arabic countries, which means I have to add a custom font. This works perfectly if you use doc.save('file.pdf'), but it doesn't seem to work correctly for doc.output(). I have to use output because I'm using jsPDF outside of a browser.
To make the library work in Titanium I've had to strip all of the references to window, because it's not running in a browser or webview.
I've tried writing the file from different sources, but nothing seems to yield any results.
My current implementation:
doc = new jsPDF();
// Read the TTF font from the app resources and base64-encode it
var f = Ti.Filesystem.getFile(Ti.Filesystem.resourcesDirectory, 'fonts/markazi-text.regular.ttf');
var contents = f.read();
var base64font = Ti.Utils.base64encode(contents).toString();
// Register the font with jsPDF's virtual file system and select it
doc.addFileToVFS("MarkaziText-Regular", base64font);
doc.addFont('MarkaziText-Regular', 'markazi-text', 'normal');
doc.setFontSize(20);
doc.setFont('markazi-text', 'normal');
doc.text('The quick brown fox jumps over the lazy dog', 20, 20);
// Write the generated PDF to a temp file
var tempFile = Ti.Filesystem.getFile(Ti.Filesystem.getTempDirectory(), 'report.pdf');
if (tempFile.exists()) {
    tempFile.deleteFile();
}
tempFile.write(doc.output());
I've also tried to write the file from a blob:
var reader = new FileReader();
reader.onloadend = function () {
tempFile.write(reader.result);
};
reader.readAsText(getBlob(buildDocument()));
But the PDF is empty if I use this. I've also tried the library in a WebView within a Titanium application, which does work, but I don't really want to go down that road; it would require too many changes to the code.
I've finally resolved it by creating a local HTML file. In this HTML file I've loaded jsPDF and my own JavaScript to generate a PDF file. I've loaded this HTML file in a WebView.
I'm generating all the data needed for the PDF in an Alloy controller. I'm sending this data to my WebView JavaScript by firing an app event and catching it in the WebView.
After the PDF is created I trigger an app event in the WebView that contains the base64 data of the jsPDF doc:
Ti.App.fireEvent('app:pdfdone', {
output: doc.output('dataurlstring').replace("data:application/pdf;filename=generated.pdf;base64,", "")
});
I finally save this as a file in the Alloy controller:
var f = Ti.Filesystem.getFile(Ti.Filesystem.getTempDirectory(), 'doc.pdf');
f.write(Ti.Utils.base64decode(e.output));
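To make the flow concrete, here is a rough sketch of the controller side, assuming the WebView fires the app:pdfdone event shown above (the event name and decoding calls come from the answer; the listener wiring itself is an assumption, not the author's exact code):

// Alloy controller (sketch): catch the event fired from the WebView and
// decode the base64 PDF data into a temp file.
Ti.App.addEventListener('app:pdfdone', function (e) {
    // e.output is assumed to be the base64 string sent by the WebView
    var f = Ti.Filesystem.getFile(Ti.Filesystem.getTempDirectory(), 'doc.pdf');
    f.write(Ti.Utils.base64decode(e.output));
    // The finished PDF is now available at f.nativePath
});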

Tableau Workbook won't appear in website when trying to embed with Javascript

So I've been able to successfully embed a workbook hosted on Tableau Server using the HTML embed code. I now want more flexibility over which reports I'm displaying to certain users of the website, so I'm moving to the JavaScript API. Unfortunately, the workbook now isn't loading.
I'm following the Basic Embed tutorial on Tableau's website: https://onlinehelp.tableau.com/current/api/js_api/en-us/JavaScriptAPI/js_api_sample_basic_embed.html
Here's my code:
function initViz() {
    var placeholderDiv = document.getElementById('tableauPlaceholder');
    var url = 'https://#########.#######.###/views/EnrollmentTool/EnrollmentChange';
    var options = {
        hideTabs: true,
        onFirstInteractive: function () {
            console.log("Run this code when the viz has finished loading.");
        }
    };
    var viz = new tableau.Viz(placeholderDiv, url, options);
}
The onFirstInteractive log statement doesn't get called, so it seems that the viz isn't actually loading.
I discovered the issue: the API version I was importing into my HTML file was viz_v1.js. I had to use tableau-2.min.js instead.
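As a quick sanity check (this snippet is an addition, not part of the original answer), you can verify that the v2 library is the one actually loaded before calling initViz, since the tableau.Viz constructor used above is part of the v2 API:

// Assumption: tableau-2.min.js exposes a global `tableau` object with a
// `Viz` constructor, so a missing constructor points at the wrong include.
if (typeof tableau === "undefined" || typeof tableau.Viz !== "function") {
    console.error("Tableau JS API v2 (tableau-2.min.js) does not appear to be loaded.");
} else {
    initViz();
}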

When using d3.js, where is the destination for the data source?

I'm looking at this example here for a d3.js chart:
http://bl.ocks.org/d3noob/8952219
Inside the example, the code that represents the data source is as follows:
d3.csv("bar-data.csv", function(error, data) {
data.forEach(function(d) {
d.date = parseDate(d.date);
d.value = +d.value;
});
It appears to point at the bar-data.csv file.
If I were to replicate this example locally, am I supposed to put the .csv file in the same directory as the index.html page?
Also, could I theoretically plug in a URL instead of the 'bar-data.csv' to point to an external data source?
https://github.com/d3/d3-3.x-api-reference/blob/master/CSV.md:
d3.csv(url[[, accessor], callback])
Issues an HTTP GET request for the comma-separated values (CSV) file at the specified url. The file contents are assumed to be RFC4180-compliant. The mime type of the request will be "text/csv".
Therefore bar-data.csv is just a relative path, resolved against the browser's execution context (i.e. what's in your address bar), so the file is expected to sit in the same directory as your HTML page. And since d3.csv simply issues an HTTP GET, you can also pass a full URL to an external data source.
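A small illustrative sketch of both options, assuming d3 v3.x (the version the linked example uses); the external URL is a placeholder, and loading from another domain additionally requires that server to allow cross-origin requests:

// Relative path: bar-data.csv sits next to index.html and is served by the
// same web server that serves the page.
d3.csv("bar-data.csv", function(error, data) {
    if (error) { return console.error(error); }
    // ... build the chart from `data`
});

// External source: a full URL works too (placeholder shown), subject to the
// remote server's CORS policy.
d3.csv("https://example.com/data/bar-data.csv", function(error, data) {
    if (error) { return console.error(error); }
    // ... build the chart from `data`
});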

Javascript: Three.js JSON error on remote server, but not on local

I'm writing a Three.js application. In part of it, I load a blender model exported as a JSON file using the Blender->JSON exporter for Three.js. I have WAMPServer 2.2 configured on my local computer (Windows 7) that I use to test my website before I FTP it to the remote server to show off to friends and such.
Loading this JSON file works fine on the local test server, but when I upload it to the server, I get the following error in Firebug, Firefox 16.0.2:
SyntaxError: JSON.parse: unexpected character
var json = JSON.parse( xhr.responseText );
three.js (line 7810)
It's finding the JSON file fine - the GET shows up in Firebug. The loading of the model is, as far as I can tell, the only loading of JSON I have in the entire script; the model also doesn't show up remotely, whereas it does locally. Here's the function with the load in it:
//Adds a unit to the scene. Assumes Init() hasn't been called on the unit yet.
function pubAddUnit(unit, coord, modelSrc)
{
    //Do whatever initialization the unit has to do
    unit.Init();
    //Store the unit in its position on the grid
    units[coord.y][coord.x] = unit;
    //Load the unit model
    var loader = new THREE.JSONLoader();
    loader.load(modelSrc,
        //Function called once the unit model is loaded
        function(geometry) {
            //Create a new material for the unit
            var material = new THREE.MeshFaceMaterial();
            var mesh = new THREE.Mesh(geometry, material);
            //Store the unit geometry and mesh into their respective grids
            unit.SetGeo(geometry);
            unit.SetMesh(mesh);
            //Move the mesh to the correct spot
            mesh.position = TransCoord2(coord);
            mesh.scale.set(40, 40, 40);
            //Add the mesh to the scene
            scene.add(mesh);
            //Update the scene, now with the mesh in
            update();
        });
}
And here's the JavaScript file, as served on the remote server. Any ideas on why this is happening are appreciated.
EDIT: I'm using FileZilla to FTP. I did notice that the file size of the JSON file on the server differs from the local one, but I'm not sure whether that's something I need to worry about - perhaps it's line endings or something?
Also, here is the JSON file.
OK, so I figured out the "problem"... It turns out the request isn't getting the JSON file at all, but simply a response that says "hacked by hacker" - I just didn't look at the GET response closely enough. I think all GETs from the site are being redirected to this "hacked by hacker" response. Obviously, this is outside the scope of this question, but if anyone has any info that could help, please let me know.
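For anyone hitting a similar symptom, a quick way to confirm what the server is actually returning is to fetch the model file directly and inspect the raw body (this sketch is an addition; the path is hypothetical - use whatever you pass to loader.load):

// If this logs an HTML/defacement page instead of JSON, the problem is
// server-side, not in THREE.JSONLoader.
var xhr = new XMLHttpRequest();
xhr.open("GET", "models/unit.json", true); // hypothetical path
xhr.onload = function () {
    console.log(xhr.status, xhr.responseText.substring(0, 200));
};
xhr.send();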

external loaded SVG broken in IE (Raphael JS)

I am using jQuery to load an external SVG (a map of France) and parse it into paths with Raphaël.js using the following code. But it is not doing anything in IE. Any ideas?
$(document).ready(function(){
    var paper = Raphael("canvas", 450, 380);
    var map = paper.set();

    // load svgz map
    $.ajax({
        type: "GET",
        url: "map-smllr.svgz",
        dataType: "xml",
        success: parseXml
    });

    // ... removed a few other variables

    function parseXml(xml) {
        var count = 0;
        $(xml).find("g").children("path").each(function()
        {
            var deptNr = depts[count];
            var path = $(this).attr("d");
            var c = paper.path(path);
            c.attr(attr).attr("title", deptNr);
            map.push(c);
            count++;
        });
        //startMap();
    }
});
You can view a full source here: http://ngjulie.com/map/raphael.html
I have a funky caching issue in Chrome too, where a blank spot is shown until the user hovers over the canvas.
But the biggest problem is that this is not working in IE. The general examples on the RaphaelJS website work fine. So it must be something in my code.
Any ideas?
Cheers,
Julie
It seems not to be working because the svgz and svg images are being served with an image/svg+xml mimetype, which is causing the IE XML parser to fail (if you add an error handler to the $.ajax call, you'll see this happening; that's good practice anyway). Likewise, if you navigate to http://ngjulie.com/map/map-smllr.svgz or http://ngjulie.com/map/map-smllr.svg in IE, you'll see it attempts to download the file rather than parsing it with the IE XML parser component.
I think if you serve the files with a text/xml or application/xml mimetype it should probably work. I tested this quickly by renaming map-smllr.svgz to map-smllr.xml, which makes it easy for my web server to serve the file with the correct mimetype. If you navigate to that file in IE8, you'll see that it gets parsed as XML. Likewise, the XHR GET succeeds and is able to parse the file, and everything else then works as expected.
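A rough sketch of the corresponding client-side change, assuming the map is renamed (or re-served) so it comes back as XML and the rest of the parsing code stays the same; the error handler and the mimeType override are debugging additions, not part of the original answer (the mimeType option requires jQuery 1.5.1+):

$.ajax({
    type: "GET",
    url: "map-smllr.xml",   // same SVG content, now served as text/xml or application/xml
    dataType: "xml",
    mimeType: "text/xml",   // ask jQuery to treat the response as XML
    success: parseXml,
    error: function (xhr, status, err) {
        // Surfaces mimetype/parse failures instead of failing silently
        console.log("SVG load failed:", status, err);
    }
});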
