I have an interesting problem that has me scratching my head a bit, and I have not been able to find anyone who has encountered the same issue.
I have a website that I am developing locally, built on a custom JavaScript framework that is tested and working. As part of the setup, the site pulls in a JSON file to determine some structures. All links within the entire project are relative.
Last week, I zipped up my files, threw them on Google Drive, and downloaded them onto my personal laptop at home. My home machine is running OS X Yosemite; my work machine is on Mavericks.
When I downloaded and unzipped the files on my Yosemite machine, Safari and Chrome both suddenly started kicking back cross-domain errors on the XMLHttpRequest that pulls in the JSON file. Nothing has changed in the file configuration, and again, all links are defined as relative links.
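For illustration, the failing call is essentially a plain XMLHttpRequest for a relative path (a sketch, not the actual framework code; the file name is made up):

```javascript
// Sketch of the failing request; "structures.json" is an illustrative name.
var xhr = new XMLHttpRequest();
xhr.open('GET', 'structures.json'); // relative path, same folder as the page
xhr.onload = function () {
    var data = JSON.parse(xhr.responseText);
    // ...the framework builds its structures from data...
};
xhr.send();
// Opened via file://, both Safari and Chrome reject this as a cross-origin request.
```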
At home, I got around this by setting up a quick vhost and throwing everything onto my development server. However, I don't have admin rights on my work machine, and now that I have brought the files back to this machine to work on, the issue has followed them here.
I use the same browsers for testing that I use to browse online on this machine, so I'm not comfortable disabling the restriction in the security settings, and, moreover, I see absolutely no reason why this should even be happening.
Has anyone encountered this issue before? What could possibly have changed, and what could I be overlooking? Projects on this machine that use the same JavaScript framework (and therefore make the exact same calls to the file system) work fine when launched from Finder. It's only the files that were uploaded to Google Drive and subsequently downloaded to my Yosemite machine that are kicking back the error.
Any ideas or suggestions would be hugely appreciated. This is proving to be a big pain and I'm rather flummoxed as to the cause.
Thanks!!
Related
I am encountering a problem on my internal website. Everything was working fine until someone turned the server off, and I had to bring it back up via Docker. The site was up and running fine after that.
However, my colleagues quickly realized that they could not add files to the website (you first upload files to the site, select save, and then it saves the uploaded file). Upon further inspection, I found that uploads only work from some pages: pages 1 and 2 both let you upload documents, but only page 2 accepts the upload request and actually saves the file; page 1 will attempt the upload and report "saved successfully", but the files will not have been uploaded.
To provide some context: I ran `sudo pkill -f uwsgi -9` before starting the server again with Docker. More importantly, the internal website is vital for everyday operations and cannot be down under any circumstances, so I am reluctant to mess with the server for fear of breaking whatever still works. My colleagues are getting by because they can upload from the other page and still have access to the information on the website.
Besides that, the biggest problem is that I have only been in this job for 2 weeks and do not know the code base through and through. The handover was also poor, because the previous developer left 2 weeks before I started. I have contacted him and shown him the problem; he told me that he was also unfamiliar with it and that the system had been running fine the whole time.
The error is as follows:
![error when I try to upload from this page](https://i.stack.imgur.com/LnGWs.png)
Works fine here:
![works from this page](https://i.stack.imgur.com/5QhdT.png)
The problem is that I am on a wild goose chase and cannot pinpoint the cause. Moreover, the error suggests a bug in the code base, which conflicts with what the previous developer told me, and I have also seen the system up and running for the past 2 weeks.
Would be very grateful for any ideas!
Can anyone tell me why this only works when I upload it, and not when I open it in a browser on my local system?
https://teac.lenguax.com/tests/ADP-Driver/ADP-001/testBed.html
The only difference I can see is that when it is uploaded I get https://, and when it is on my local system I get file:///. Is there any way I can make this work on my local system without having a local webserver?
Any help would be greatly appreciated.
Cheers,
Tyrone
> ...when it is on my local system I get file:/// - is there any way I can make this work on my local system without having a local webserver?
Security around the file: scheme is browser-specific. Chrome is fairly restrictive, whereas Firefox is more permissive. So you might try Firefox.
But lots of things behave slightly differently (and break) when you use content from the file: scheme rather than http: or https:. I strongly recommend using a local web server when doing browser programming. Any good IDE will have one built in, and there are many Node modules (for instance) that will run your project in a local server as well. (And installing a local server isn't all that difficult.)
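If you have Node installed, even a tiny hand-rolled static server will do for this kind of testing. A minimal sketch, for local use only (the port and MIME table are illustrative):

```javascript
// serve.js: a minimal static file server using only Node built-ins.
// Run `node serve.js` from the project folder, then open
// http://localhost:8000/ in the browser. For local testing only.
const http = require('http');
const fs = require('fs');
const path = require('path');

const types = {
  '.html': 'text/html',
  '.js': 'text/javascript',
  '.css': 'text/css',
  '.json': 'application/json',
  '.png': 'image/png',
};

http.createServer((req, res) => {
  let urlPath = req.url.split('?')[0];          // drop any query string
  if (urlPath === '/') urlPath = '/index.html'; // default document
  const file = path.join(__dirname, urlPath);

  fs.readFile(file, (err, data) => {
    if (err) {
      res.writeHead(404);
      res.end('Not found');
      return;
    }
    const type = types[path.extname(file)] || 'application/octet-stream';
    res.writeHead(200, { 'Content-Type': type });
    res.end(data);
  });
}).listen(8000, () => console.log('Serving on http://localhost:8000'));
```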
I'm encountering an issue with some local web prototyping.
I've been working on a single page which accesses files on my C:/ drive, by starting Chrome with --allow-file-access-from-files.
This is great: my first page works successfully and loads in my .js, .css files, etc., as expected.
However, when I click the link to proceed to the next page, the HTML loads, but none of the styles, JavaScript, or even images load.
I'm receiving 'Failed to load resource' errors in the console, despite the file:// URL pointing to the correct location.
Is there any way around this issue?
In lieu of a solution, some advice: Set up a web server on your computer for testing. Developing in an environment that's similar to a "production" environment, as opposed to working around the quirks of local file access, will save you quite a bit of time in the long run.
There are a number of tools that will help you set up a development web server; XAMPP is a popular one.
There is probably a better title for what I'd like to accomplish, but the details should be helpful.
I've recently learned that specifying a script's src path as //some.domain.com rather than http://some.domain.com or https://some.domain.com will cause the browser to request the script using whichever protocol was used to load the page. This works great when the page is loaded from a site, but often I debug on my local system, so the protocol is file, and of course errors occur whenever resources or scripts aren't found.
Other than changing src paths, is there a better way to debug locally? I imagine there is a code solution that detects whether the page is running locally versus loaded from a domain, but I haven't found examples yet.
Install a product such as WampServer; then you'll have a localhost web server you can test everything on. This is how I do it, and it works like a charm.
There are similar products available for ASP or other non-PHP server-side technologies (you didn't specify); if you are just doing HTML + JS, then any old server will do.
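If you do want the in-code detection the question asks about, the usual trick is to check window.location.protocol and fall back to an explicit protocol when the page came from file:. A sketch (the domain is the one from the question; the rest is illustrative):

```javascript
// When the page is opened from the local filesystem, location.protocol
// is "file:", and protocol-relative URLs would resolve to file:// paths.
var protocol = window.location.protocol === 'file:' ? 'https:' : '';

var script = document.createElement('script');
script.src = protocol + '//some.domain.com/script.js'; // illustrative path
document.head.appendChild(script);
```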
Ok, so I'm lost here, frustrated, and pulling my hair out. Plus I'm probably about to be fired or take a pay cut.
I moved files from a development server to my local machine. The files are consistent (I used a diff tool), and all the dependencies are there. It works for the most part. The problem is that some of the JavaScript (not all of it) is just not working. We're using jQuery and a lot of plugins for it. I've checked with the Web Developer plugin in Firefox, and all the JS files are loading. I cleared the cache in both Firefox and Chrome multiple times to no avail. The development server is a Windows server running WAMP. My local machine is running Ubuntu. Somebody tell me what I missed.
Download Firebug as a Firefox extension and view the HTTP requests and responses.
The easiest place to start may be the 'Net' tab, to determine whether your script is making a request at all.
Very likely it is a source-domain issue. There is no workaround for this: the Ajax request and the source data must be on the same domain.
It may have something to do with JavaScript's security limitations. In certain circumstances, you can only operate on URLs or pages from the current domain, which most likely changed when you moved the files off the other server. More here.
Are you running the files via a webserver, or just opening the files directly? If it's the latter, you'll want to set up a server on your local machine for local testing, and serve the files using it. Otherwise, you'll very likely run into the domain restrictions others have mentioned above.
You may need to host the site using a local server. VS Code has an extension called Live Server. You need to set up a workspace in order for it to work. The port used on my machine was 5500.
You need to make sure any dependencies your JavaScript relies on are installed and running on the server, or the JavaScript will not execute. These dependencies are listed in the JSON file (package.json in a Node project).
For example, if you require Express, you need Node running the app, or the JavaScript won't work in your web browser.
In the terminal:
```
node app.js
```
Any dependencies that are not installed and running on the server will not execute.
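For example, a minimal Express app (a hypothetical app.js; assumes express is listed in package.json and has been installed with npm install):

```javascript
// app.js: if this process is not running, nothing served by it
// will be reachable from the browser.
const express = require('express');
const app = express();

app.use(express.static('public')); // folder name is illustrative

app.listen(3000, () => console.log('App running at http://localhost:3000'));
```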
Are you accessing the HTML pages through the web server, and not simply double-clicking the file to open it?
Also, if you have the Web Developer toolbar installed, click 'Disable', then 'Disable JavaScript', and make sure 'All JavaScript' isn't ticked.