I'm encountering an issue with some local web prototyping;
I've been working on a single page that accesses files on my C:/ drive by starting Chrome with --allow-file-access-from-files.
This is great; my first page works successfully and loads in my .js and .css files etc. as expected.
However, when I click the link to proceed to the next page, the HTML loads, but none of the styles, JavaScript (or even images) load.
I'm receiving 'Failed to load resource' errors in the console, despite the file:// URL pointing to the correct location.
Is there any way around this issue?
In lieu of a solution, some advice: Set up a web server on your computer for testing. Developing in an environment that's similar to a "production" environment, as opposed to working around the quirks of local file access, will save you quite a bit of time in the long run.
There are a number of tools that will help you set up a development web server; XAMPP is a popular one.
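If a full XAMPP install feels like overkill for a prototype, a few lines of Node are enough to serve the folder over http:// instead of file://. This is only a sketch; server.js, the port, and the MIME list are my own assumptions, so adjust them to your project.
// server.js - minimal static file server for local prototyping only
// (no path sanitization or caching; file name and port are assumptions, adjust as needed)
const http = require('http');
const fs = require('fs');
const path = require('path');

const ROOT = __dirname;   // folder that holds your HTML, CSS, JS and images
const PORT = 8080;
const MIME = {
  '.html': 'text/html',
  '.js': 'text/javascript',
  '.css': 'text/css',
  '.json': 'application/json',
  '.png': 'image/png',
  '.jpg': 'image/jpeg'
};

http.createServer((req, res) => {
  // map the requested URL (minus any query string) onto the local folder
  const urlPath = req.url.split('?')[0];
  const filePath = path.join(ROOT, urlPath === '/' ? 'index.html' : urlPath);
  fs.readFile(filePath, (err, data) => {
    if (err) {
      res.writeHead(404);
      res.end('Not found: ' + urlPath);
      return;
    }
    res.writeHead(200, { 'Content-Type': MIME[path.extname(filePath)] || 'application/octet-stream' });
    res.end(data);
  });
}).listen(PORT, () => console.log('Serving ' + ROOT + ' at http://localhost:' + PORT));
Run it with node server.js and browse to http://localhost:8080/ instead of the file:// path; because every page, script, and image now comes from the same origin, navigating between pages no longer trips over the local file restrictions.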
On Cloudflare I want to disable caching so that I can see my website changes immediately after I've pushed them live.
Things I've tried:
I've put development mode on.
Created a bypass for caching in page rules.
Purged an individual webpage.
Purged the website.
Set cache to clear every 2 hours.
None of the above worked.
Tech I'm using:
Angular2
SystemJS
TypeScript, which is compiled to JavaScript on build.
Firebase for hosting and database.
Cloudflare for SSL etc.
The only way people see my website changes is if they hard refresh.
The main problem is that I've got a JavaScript file called app.js that contains all of the JavaScript for my Angular app, and the browser doesn't seem to be trying to fetch that resource again.
I've changed the reference from app.js to app.js?1490959855777, and it still doesn't fetch the file again.
I basically want users to get my updated JS file without having to hard refresh.
Based on the discussion above, it looks like the caching is happening in the browser, since a hard refresh does get the new file contents.
I think what happened is that Cloudflare told the browser to hold onto that file for a very long time, and the browser is honoring that instruction.
Because you can't ask your users to do a hard refresh, you'll need to rename the static files that are being cached so aggressively.
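If renaming by hand on every deploy is a pain, a small build step can do the fingerprinting for you. The sketch below is only illustrative and assumes app.js and index.html sit next to the script; fold it into whatever build step you run before deploying to Firebase.
// rename-hash.js - fingerprint app.js and point index.html at the new name
// Sketch only: file names are assumptions, adapt them to your build.
const fs = require('fs');
const crypto = require('crypto');

const source = fs.readFileSync('app.js');
const hash = crypto.createHash('md5').update(source).digest('hex').slice(0, 8);
const hashedName = 'app.' + hash + '.js';

// write the fingerprinted copy alongside the original
fs.writeFileSync(hashedName, source);

// rewrite the reference (app.js or a previously hashed app.xxxxxxxx.js)
const html = fs.readFileSync('index.html', 'utf8')
  .replace(/app(\.[0-9a-f]{8})?\.js/g, hashedName);
fs.writeFileSync('index.html', html);

console.log('Script is now referenced as ' + hashedName);
Because the hashed name is a brand-new URL, neither Cloudflare nor the browser has a stale copy of it; the ?1490959855777 query-string trick only helps if the HTML that references it gets revalidated in the first place.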
I know there are similar questions, but I could not find one explaining what I am trying to do.
At one of the events I will be working at, the MC will need to play music from his browser (it has been set up that way so that all live schedules stay updated).
The problem I have is that I get the "Not allowed to load local resource" error when I try to load the audio file from the local drive.
The reason I am trying to load the file from the local drive is so that, if the network fails or something happens to the local server, the event can still continue.
I have read that Chrome gives this error for privacy and security reasons, but Firefox simply doesn't load the file and gives no error at all.
Is there a browser where this will be possible or is there a way to change browser settings to allow this?
I have tried using the Flash settings to add the file's location as a trusted location; I am, however, unable to find a Flash setting that says "Load from local disk (only)".
Thanks in advance.
No, it's not possible to load files from the local machine for security reasons. Imagine what I could read from your machine if it was >:D
You have to run your code on a web server, and also host the file there. You can easily install IIS if you're on Windows, as it's included as an additional component. There's also XAMPP, which is free.
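Once the folder holding the track is served by IIS, XAMPP, or any static server, the page just references it over http:// instead of a local path. Here's a rough sketch of what that ends up looking like; the element id, port, and file name are placeholders, not something from your setup:
// play a backup track that a local web server exposes at http://localhost:8080/
// (placeholder port, path and button id; wiring the click also satisfies autoplay policies)
const backupTrack = new Audio('http://localhost:8080/music/backup-track.mp3');
document.querySelector('#play-backup').addEventListener('click', () => {
  backupTrack.play().catch(err => console.error('Playback failed:', err));
});
Keeping the server and the files on the MC's own machine still gives you the offline fallback you're after, since nothing has to cross the network.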
Is local file mapping in dev tools an alternative to Charles Proxy in Firefox? I am trying to map a remote server resource to a local file, but it doesn't seem to be working. The console statements and the changes in the JS file don't seem to be applied when I reload the page.
I can see the local folder and file listed in the Sources panel. I also see that Sources doesn't show the original JS file. But I see this message in the Sources tab for the local file:
Workspace mapping mismatch
The rest of the warning states that the file in the local folder is different from the remotely loaded file. Why is that a problem? Wouldn't that always be the case, since you want to edit the file locally?
What am I missing? Any pointers to fixing this? Is my assumption wrong that this feature in Chrome dev tools can load a resource locally, as if it were loaded from the original location?
I tested this again with a simple HTML page with a single JS file containing a simple console log statement - "loading remote file ...". This file is mapped to a local JS file with a different log statement, "loading local file..". However, I still see the log message from the remote file.
Added a snapshot from the dev tools Sources tab for more context. The Sources tab shows the local folder and file correctly, but shows the mapping warning. Also notice that Sources doesn't have the remote.js file anymore.
Is my assumption wrong that this feature in Chrome dev tools can load a resource locally, as if it were loaded from the original location?
I don't think this is accurate. When you map a file on a server to your local workspace, Chrome acts as a sort of editor for your local files. You can edit the files through Chrome and press Command+S to save your local files. But nothing has changed on the server: it doesn't update the files on the server, and it doesn't tell Chrome to "use my local files instead of what's on the server".
What many people do is automate the deployment process so that when a local file is updated (either through Chrome + workspace mapping or simply by editing in your editor), your working copy gets deployed. That way, the next time you reload the browser, you'll see your edits.
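As a rough sketch of that kind of automation, assuming the server's docroot is reachable as a local path or a mounted drive (both paths below are placeholders):
// watch-and-copy.js - redeploy a file to the served docroot whenever it changes
// SRC and DEST are placeholders; fs.watch may fire more than once per save, which is fine here.
const fs = require('fs');
const path = require('path');

const SRC = path.join(__dirname, 'src', 'remote.js');   // local working copy (the one mapped in DevTools)
const DEST = '/var/www/site/js/remote.js';              // wherever the web server actually serves from

fs.watch(SRC, (eventType) => {
  if (eventType !== 'change') return;
  fs.copyFile(SRC, DEST, (err) => {
    if (err) console.error('Copy failed:', err);
    else console.log('Deployed', SRC, '->', DEST);
  });
});
console.log('Watching', SRC);
With something like that running (or an equivalent file-sync task in your build tool), a save in DevTools or your editor lands on the server before you hit reload.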
Edit: From the workspace documentation:
And you can map resources served from a local web server to files on disk, so when you change and save those files, you can view them as if they were being served.
I think the key here is local web server. I did a bit more digging and found this dev tools docs issue, with a comment effectively saying that what you're trying to do isn't supported:
The DevTools currently does not do resource substitution. It can simply map the remote files to your local copy so if things are kept in sync (like using a local server on-system) then when refreshing your modifications can persist.
Looks like you'll need a way to deploy after making changes or have your devtools workspace point to the server docroot.
The Charles Proxy "map local" feature was requested of the Chromium team in this issue and the team declined to pursue it.
I have an interesting problem that has me scratching my head a bit, and I have not been able to find anyone who has encountered the same issue.
I have a website that I am developing locally, built off a custom javascript framework that is tested and working. As part of the setup, the site pulls in a JSON file to determine some structures. All links within the entire project are relative links.
Last week, I zipped up my files, threw them on google drive, and downloaded them onto my personal laptop at home. My home machine is running OS X Yosemite, my work machine is on Mavericks.
When I downloaded and unzipped the files on my Yosemite machine, suddenly Safari and Chrome both started kicking back cross domain errors on the XMLHttpRequest that pulls in the JSON file. Nothing has changed in the file configuration, and again, all links are defined as relative links.
At home, I got around this by just setting up a quick vhost and throwing everything onto my development server. However, I don't have admin rights on my work machine, and now that I have brought the files back to this machine to work, the issue has followed along to this machine.
I use the same browsers to test that I use to browse online on this machine, so I'm not comfortable disabling the restriction in the security settings, and, moreover, I see absolutely no reason why this should even be happening.
Has anyone encountered this issue before? What could possibly have changed, and what could I possibly be overlooking? Projects on this machine that use the same javascript framework (and therefore are making the exact same calls to the file system) work fine when launched from Finder. It's only the files that were uploaded to google drive and subsequently downloaded to my Yosemite machine that are kicking back the error.
Any ideas or suggestions would be hugely appreciated. This is proving to be a big pain and I'm rather flummoxed as to the cause.
Thanks!!
OK, so I'm lost here, frustrated, and pulling my hair out. Plus I'm probably about to be fired or take a pay cut.
I moved files from a development server to my local machine. The files are consistent (I used a diff tool), and all the dependencies are there. It works for the most part. The problem is that some of the JavaScript (not all) is just not working. We're using jQuery and a lot of plugins for it. I've checked with the Web Developer plugin in Firefox and all the JS files are loading. I cleared the cache in both Firefox and Chrome multiple times to no avail. The development server is a Windows server running WAMP. My local machine is running Ubuntu. Somebody tell me what I missed.
Download Firebug as a Firefox extension and view the HTTP requests and responses.
The easiest way may be to use the 'Net' tab to determine whether your script is making the request.
It is very likely a source domain issue. There is no workaround for this: the AJAX request and the source data must be on the same domain.
It may have something to do with JavaScript's security limitations. In certain circumstances you can only operate on URLs or pages from the current domain, which most likely changed when you moved the files off the other server. More here.
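One quick way to confirm it's an origin problem rather than a missing file is to run the same request once from the page opened via file:// and once from the page served over http://localhost; in Chrome, typically only the served page succeeds. The path below is just a placeholder:
// same-origin sanity check (placeholder path); run it from the browser console on both pages
$.getJSON('data/config.json')
  .done(function (data) { console.log('loaded', data); })
  .fail(function (jqXHR, textStatus, error) { console.error('request blocked or failed:', textStatus, error); });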
Are you running the files via a webserver, or just opening the files directly? If it's the latter, you'll want to set up a server on your local machine for local testing, and serve the files using it. Otherwise, you'll very likely run into the domain restrictions others have mentioned above.
You may need to host the site using a local server. VS Code has an extension called Live Server; you need to set up a workspace in order for it to work. The port used on my machine was 5500.
You need to make sure any dependencies for your JavaScript are installed and running on your server, or the JavaScript will not be executed. These dependencies are listed in package.json.
For example, if you require Express, you need to be running Node or the JavaScript won't execute in your web browser.
In the terminal:
node app.js
Any dependencies that are not installed and running on the server will not execute.
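For context, the app.js being started there is usually something small like the sketch below; Express is just the example dependency this answer mentions, and the public folder and port are assumptions:
// app.js - minimal Express app that serves the site's static files
// ('public' and port 3000 are assumptions; install the dependency with npm install express)
const express = require('express');
const app = express();

app.use(express.static('public'));   // HTML, CSS and JS live in ./public

app.listen(3000, () => console.log('Listening on http://localhost:3000'));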
Are you accessing the HTML pages through the web server and not simply double-clicking the file to open it?
Also, if you have the Web Developer toolbar installed, click "Disable" > "Disable JavaScript" and make sure "All JavaScript" isn't ticked.