I am encountering a problem on my internal website. Everything was working fine until someone turned the server off, so I had to bring the server back up via Docker. The site was up and running fine after that.
However, my colleagues quickly realized that they could not add files to the website (the workflow is: upload the files, select save, and the site stores them). On further inspection, I found that uploads only work from some pages. For example, pages 1 and 2 both let you upload documents, but only page 2 actually accepts the upload request and stores the file; page 1 will attempt the upload and report "saved successfully", yet the files are never actually uploaded.
For context, I ran "sudo pkill -f uwsgi -9" before bringing the server back up with Docker. More importantly, the internal website is vital for everyday operations and cannot be down under any circumstances, so I am reluctant to poke at the server for fear of breaking whatever still works. My colleagues are getting by for now, since they can upload from the other page and still have access to the information on the website.
On top of that, the biggest problem is that I have only been in this job for 2 weeks and do not know the code base through and through. The handover was also poor, because the previous developer left 2 weeks before I started. I have contacted him and shown him the problem; he told me he was unfamiliar with it too and that the system had been running fine the whole time.
The error is as follows:
[error when I try to upload from this page]
https://i.stack.imgur.com/LnGWs.png
Works fine here:
[works from this page]
https://i.stack.imgur.com/5QhdT.png
So I am on a wild goose chase and cannot pin down the cause. Moreover, the error suggests there is a bug in the code base, which conflicts with what the previous developer told me, and I have also seen the system up and running myself for the past 2 weeks.
Would be very grateful for any ideas!
On Cloudflare, I want to disable caching so that changes I push live to my website are visible immediately.
Things I've tried:
Put Development Mode on.
Created a cache bypass in Page Rules.
Purged an individual page.
Purged the whole website.
Set the cache to clear every 2 hours.
None of the above worked.
Tech I'm using:
Angular2
SystemJS
TypeScript, which compiles to JavaScript on build.
Firebase for hosting and database.
Cloudflare for SSL etc.
The only way people see my website changes is if they hard refresh.
The main problem is that I've got a JavaScript file called app.js which has all the JavaScript for my Angular app, and the browser doesn't even seem to try to fetch that resource again.
I've changed the app.js to app.js?1490959855777
And it still doesn't fetch the file again.
I basically want to see my JS file without a user having to hard refresh.
Based on the discussion above, it looks like the caching is happening on the browser - since a hard refresh will get the new file contents.
I think what happened is that Cloudflare told the browser to hold onto that file for a very long time, and the browser is honoring that instruction.
Because you can't ask your users to do a hard refresh, you'll need to rename the static files that are being cached so aggressively.
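Any content hash or build timestamp baked into the filename itself will do the trick, because a new name is, to both the browser and Cloudflare, a brand-new resource. Here is a minimal sketch of a post-build step; the dist/ folder, the index.html location, and the script name are assumptions about your build output, not something your current setup necessarily produces:

```javascript
// build-cache-bust.js -- hypothetical post-build step.
// Assumes the compiled bundle is ./dist/app.js and is referenced from ./dist/index.html.
const crypto = require('crypto');
const fs = require('fs');

const bundle = fs.readFileSync('dist/app.js');
const hash = crypto.createHash('md5').update(bundle).digest('hex').slice(0, 8);
const hashedName = `app.${hash}.js`;

// Rename the bundle so every deploy produces a brand-new URL.
fs.renameSync('dist/app.js', `dist/${hashedName}`);

// Point index.html at the new filename.
const html = fs.readFileSync('dist/index.html', 'utf8');
fs.writeFileSync('dist/index.html', html.replace(/app\.js/g, hashedName));

console.log(`Bundle renamed to ${hashedName}`);
```

The page that references the bundle (index.html or equivalent) still needs to be revalidated on each visit for this to work, so keep its max-age short; Cloudflare normally doesn't cache bare HTML unless a Page Rule tells it to.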
I'm uploading one or more images for a certain item. I can see the files are being stored and saved on the server just fine, yet I get a 404 error when the UI is supposed to show them; if I refresh a couple of seconds later, they appear correctly.
I know this is weird and that I haven't given much information or code; it would be a mess if I pasted all the functions and backend calls I'm making. I just want to know if someone has dealt with this kind of thing before or knows what it could be.
OK, so the problem was that I was uploading the file(s) to the assets folder, but the view can't access files there; it looks for the ones in .tmp. So I now upload them to .tmp and then copy them to assets, so that on the next app lift these files get copied into .tmp again (since .tmp is destroyed as soon as the server is stopped).
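For anyone hitting the same thing, this is roughly the shape it took in the controller. It's only a sketch of the idea above, assuming this is a stock Sails.js app with the Skipper body parser (which the .tmp / assets / lift wording suggests); the 'image' field name, the images/ folders, and the action name are invented for the example:

```javascript
// Hypothetical Sails.js controller action -- a sketch of the approach above.
const path = require('path');
const fs = require('fs');

module.exports = {
  upload: function (req, res) {
    // Write straight into .tmp/public so the running app can serve the file immediately.
    const servedDir = path.resolve(sails.config.appPath, '.tmp/public/images');

    req.file('image').upload({ dirname: servedDir }, function (err, uploadedFiles) {
      if (err) return res.serverError(err);

      // Copy each file into assets/ as well, so the Grunt copy task puts it
      // back into .tmp/public on the next lift (.tmp is rebuilt at every lift).
      const assetsDir = path.resolve(sails.config.appPath, 'assets/images');
      uploadedFiles.forEach(function (f) {
        fs.writeFileSync(path.join(assetsDir, path.basename(f.fd)), fs.readFileSync(f.fd));
      });

      return res.json({ files: uploadedFiles });
    });
  }
};
```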
I'd like to allow my users to click a button in a list of tunes on a page to open a simple mp3 player (the HTML5 Player is fine) that can play a downloaded mp3 track for that song that is stored on the user's hard drive. Is that even possible? Every attempt I've tried - using HTML and/or JS, JQ - fails.
I can copy the local mp3 file path/filename into my Chrome address bar. With no code at all it helpfully opens an HTML5 player in a new tab that allows me to play the tune just fine. Why is it so difficult to allow the user to do the same thing by simply clicking a button inside my app?
I have been able to get an mp3 player to appear on the page. But no matter how I specify the file path it refuses to play the tune - occasionally telling me my code is not allowed to access local files.
For security reasons, JavaScript in the browser does not have the privilege to modify, or even open, arbitrary files on the client machine.
If that is absolutely what you need to accomplish, you could try a Java applet.
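That said, a browser will hand your script a local file the user has explicitly chosen themselves, e.g. through a file input, and an object URL made from that file can feed an HTML5 audio element. A minimal sketch of that approach (the element IDs here are invented for the example):

```javascript
// Minimal sketch: let the user pick the mp3, then hand it to an HTML5 <audio> element.
// Assumes the page contains
//   <input type="file" id="trackPicker" accept="audio/mpeg">
//   <audio id="player" controls></audio>
// (both IDs are made up for this example).
document.getElementById('trackPicker').addEventListener('change', function (event) {
  const file = event.target.files[0];
  if (!file) return;

  // URL.createObjectURL gives the page a temporary URL for the user-selected
  // file, which the audio element can play without any local-file privileges.
  const player = document.getElementById('player');
  player.src = URL.createObjectURL(file);
  player.play();
});
```

The user still has to pick the file each time rather than your code reaching for a path on disk, which is exactly the restriction described above.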
Thanks Lyes Ben. Over the last few days, thinking about your comments has helped me understand that what I was attempting was not the right approach - and why. After some research, I now believe that using the Dropbox API I can code a simple 'drop-in saver' function that would not only automatically save the files my user generates through the app locally, but would at the same time provide another feature that was on my list - offline access to those files. As a bonus, the files would be synced across all the user's devices, with no additional code or complexity in the app.
Sometimes I get so focused on solving a particular technical problem that I fail to step back and ask if it is the right problem to solve in the first place.
It's not done yet, but I'm now working on that Dropbox interface to my app. I'll update this answer when (if) I get there, as I suspect this could be a solution for others facing a similar problem.
I have an interesting problem that has me scratching my head a bit, and I have not been able to find anyone who has encountered the same issue.
I have a website that I am developing locally, built off a custom javascript framework that is tested and working. As part of the setup, the site pulls in a JSON file to determine some structures. All links within the entire project are relative links.
Last week, I zipped up my files, threw them on Google Drive, and downloaded them onto my personal laptop at home. My home machine runs OS X Yosemite; my work machine is on Mavericks.
When I downloaded and unzipped the files on my Yosemite machine, suddenly Safari and Chrome both started kicking back cross domain errors on the XMLHttpRequest that pulls in the JSON file. Nothing has changed in the file configuration, and again, all links are defined as relative links.
At home, I got around this by just setting up a quick vhost and throwing everything onto my development server. However, I don't have admin rights on my work machine, and now that I have brought the files back to this machine to work, the issue has followed along to this machine.
I test in the same browsers that I use to browse online on this machine, so I'm not comfortable disabling the restriction in the security settings, and, moreover, I see absolutely no reason why this should even be happening.
Has anyone encountered this issue before? What could possibly have changed, and what could I possibly be overlooking? Projects on this machine that use the same javascript framework (and therefore are making the exact same calls to the file system) work fine when launched from Finder. It's only the files that were uploaded to google drive and subsequently downloaded to my Yosemite machine that are kicking back the error.
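In case it's useful context, the vhost workaround I used at home really just boils down to loading the project over http:// instead of file://. Even a throwaway static server like the sketch below does the same job and doesn't need admin rights (it assumes Node is installed; the port and the index.html default are arbitrary):

```javascript
// serve.js -- throwaway static file server so the project is loaded over
// http:// instead of file://. Run `node serve.js` from the project root,
// then open http://localhost:8080/ in the browser.
const http = require('http');
const fs = require('fs');
const path = require('path');

http.createServer(function (req, res) {
  // Strip any query string and map the URL onto the project folder.
  const urlPath = decodeURIComponent(req.url.split('?')[0]);
  const filePath = path.join(__dirname, urlPath === '/' ? 'index.html' : urlPath);

  fs.readFile(filePath, function (err, data) {
    if (err) {
      res.writeHead(404);
      return res.end('Not found');
    }
    res.writeHead(200); // content-type handling left out to keep the sketch short
    res.end(data);
  });
}).listen(8080);
```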
Any ideas or suggestions would be hugely appreciated. This is proving to be a big pain and I'm rather flummoxed as to the cause.
Thanks!!
I've done this a million times, but this time it's having its way with me. I usually work on Joomla sites and this one is a Drupal site, so maybe there's something to that.
In FileZilla, I uploaded scripts.js.
I noticed an error in my script. I fix it and reupload it. No failed transfer. I check to make sure the fix works, but find the old version of the code still there.
OK, I clear Chrome's cache and reload. Old code still there.
I delete the file off the server. Old code still there. (!!)
I completely barf all over the javascript. Should throw errors. I upload it. Old code still there and no errors.
I change the name of the new scripts.js to scripts2.js and upload it. I navigate to that file via its URL in the browser. Not found. It's right there on the server though; I uploaded it and refreshed the FileZilla pane. FileZilla says it's there. The browser says it's not. I copied and pasted the path, so there are no spelling mistakes. The new file is reported as a 404 by the browser.
I completely delete the project folder off the server and reupload. Old code still there. Starting to go nuts.
This has got to be some sort of caching issue.
There is a "clear all caches" function in Drupal. It should be under Settings -> Performance. Try that.