Currently I am getting data from a hardware device (a charge/load controller) over WiFi; it has an ESP8266 configured as an access point.
The WiFi is set up to ignore all requests from the computer and just send its data once per second.
The data is a single string defining about 20 JavaScript variables...
var xx1="text1";
var xx2="text2"; etc...
I get the data by refreshing the HTML5 page, processing it with JavaScript, and logging it to localStorage.
It all works well, except that I can only refresh at about a 3-second interval minimum for reliable, consistent data logging. The browser (Firefox) takes a while to complete a refresh.
Q. Is there a way I can get every 'data send' using JavaScript without a page refresh? That way I can log just the periodic strings I choose, from 1 second to xxx seconds.
I suspect I might need to install some library component to access with my JavaScript? I would need to embed this into my HTML file if possible, or have it reside in the same folder.
I have been learning JS for about 2 weeks now, getting most of it from examples & my mistakes.
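For reference, the usual pattern for this is to poll on a timer with fetch or XMLHttpRequest instead of reloading the page. A minimal sketch, assuming the ESP8266 serves its variable string at its default access-point address (the URL and interval are assumptions; adjust to your device):
// Poll the device every second without reloading the page.
setInterval(function () {
  fetch("http://192.168.4.1/") // default ESP8266 AP address; yours may differ
    .then(function (response) { return response.text(); })
    .then(function (body) {
      // body is the "var xx1=...; var xx2=...;" string; log the raw string
      localStorage.setItem("reading-" + Date.now(), body);
    })
    .catch(function (err) { console.log("poll failed", err); });
}, 1000);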
The Issue
Recently, I deployed a page to production containing JavaScript which used setInterval() to ping a web service every few seconds. The behavior can be summed up as follows:
Every X seconds, JavaScript on the upcomingEvents.aspx page calls the hitWebService() function, which sits in the hitWebService.js file.
The X interval value proved to be too small, so I removed all references to hitWebService(), the hitWebService.js file itself, and the web service it was trying to reach.
Attempts to hit the web service from normal IP addresses dropped off, but I am still getting attempted hits from a number of users who use a proxy service.
My theory is that my upcomingEvents.aspx and hitWebService.js have been cached by the proxy service. Indeed, when I log the referrer strings when a user hits the error page (every so often, one of these users will get redirected here), they are being referred from upcomingEvents.aspx.
The issue is that the attempts to hit this web service are filling up the IIS logs at an uncomfortable rate, and are causing unnecessary traffic on the server.
What I have attempted
Removed web service completely
Deleted the hitWebService.js file; also replaced it with a dummy file
Changed content expiration on IIS so that content expires immediately
Added Response.Cache.SetCacheability(HttpCacheability.NoCache) to the .vb codebehind on page
Completely republished site with changes
Restarted IIS (a full stop and start).
Interesting bits
I can alter the VB codebehind on upcomingEvents.aspx to log session details, etc., and it seems to update almost instantly for the proxy service users
The Question
If my theory is correct, and the proxy server is indeed caching the files hitWebService.js and upcomingEvents.aspx, are there any other routes I can go down to force a code refresh, considering the above strategies haven't worked?
Thanks a lot,
In my case, I had an AJAX call being cached by ASP.NET. I used a parameter with the JavaScript date in milliseconds, so each call has a different query string.
like this:
function addQueryStringAntiCache(url)
{
    // Append the current time in milliseconds so every call has a unique query string.
    var n = new Date().getTime();
    return url + (url.indexOf("?") == -1 ? "?" : "&") + "nocache=" + n;
}
You can do the same thing for scripts:
<script src="myScript.js?v=8912398314812"></script>
If you have access to the machine, use Fiddler to check whether the browser even makes the call to get the files, or whether it uses its own cache. In that case you can try to include meta tags or HTTP headers to prevent it from caching them. Check this response
Hope it helps you.
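For completeness, the helper above might be used with a periodic call like this (the endpoint path is hypothetical):
// Bust the proxy/browser cache on every poll.
setInterval(function () {
  var xhr = new XMLHttpRequest();
  xhr.open("GET", addQueryStringAntiCache("/services/hitWebService.asmx/ping"));
  xhr.send();
}, 5000);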
Currently I'm working on a project where a user enters a lot of data constantly over an hour-long window. I'm looking to have one user control all the data via some control panel, and then have a link they can distribute to other users that will allow them to view that data without the ability to edit it.
Right now I'm using some extremely weird methods. I have an XHR request on the control page that fires whenever a field is finished being edited. From there the data is sent to a PHP file that converts the data into a simple text file. Then the page at the distributed link loads that file one time and translates it into the necessary format.
One problem I've run into is that it seems odd to start with JavaScript data, go to a PHP file, then to a text file, and then translate the data all the way back into JavaScript data again. Another problem is that I'm not sure of a way to force users to reload the page when a field is edited in the control panel after a user has opened the view page.
Have I totally gone overboard here? What are some better concepts I could employ to accomplish this task?
If I understand what you want to do, this is how I would do it:
First, the data entry
If you have a lot of fields, you'd better use a form wizard. I don't have a particular one in mind right now, but there are lots of them; just search for "jQuery form wizard".
Here is an example:
http://i.stack.imgur.com/Luk2b.jpg
The concept of the form wizard is to guide the user through multiple pages and also validate the data, with a save at the end.
Then save the data in a database.
Display content
All you need to do is create a separate page to display your content.
Let's say something like: http://yourserver.com/view/{id}
where id is the identifier of the particular row in your database.
I'm not sure if I totally understand what you are about to do. I'm trying to make your work description shorter here:
You want to build a website where one person can edit a single page's content within a 1-hour window, and others can view the content changes during that hour.
If this is what you want to build, here's the model:
teacher: the one who can edit the page
student: the one who can only view the page
server: information center
teacher client edits page -> teacher client sends update data to server -> server saves data -> server sends update notice to student client -> student client receives update notice -> student fetches update data from server
To make this model work well, I suggest trying sockets instead of HTTP requests, just like online games and IMs do.
Try socket.io.
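A minimal sketch of that flow with socket.io (the event names and port are illustrative, not part of socket.io itself):
// --- server (Node.js) ---
var io = require("socket.io")(3000);
io.on("connection", function (socket) {
  // teacher client emits "edit" with the changed data; relay it to all students
  socket.on("edit", function (data) {
    // save data on the server here, then notify everyone else
    socket.broadcast.emit("update", data);
  });
});
// --- client (student page) ---
var socket = io("http://yourserver.com:3000");
socket.on("update", function (data) {
  // re-render the affected part of the page with the new value
});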
Summary:
I am attempting to pass base64-encoded image data via a form field input. My code works fine on all browsers I've tested it on, but there is a severe amount of CPU lag after submit on Google Chrome, the length of which is proportional to the length of the data submitted.
Details:
What I'm Doing:
I have an SVG editor on my site in which users may create images to be saved to their profile. Once the user finishes their work, they click 'save', which kicks off some JavaScript to convert the SVG into an encoded data string via canvas.toDataURL(), store it in a hidden input field, submit the form, and return the user to an overview of their designs.
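The save step presumably looks something like this (a sketch; the element IDs are assumptions):
// Copy the canvas data into a hidden field and submit the surrounding form.
var canvas = document.getElementById("svg-canvas");        // hypothetical ID
document.getElementById("image-data").value = canvas.toDataURL();
document.getElementById("save-form").submit();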
What's the problem?
The code itself seems to function without issue in both Firefox and Google Chrome. Firefox page loads take 1-2 seconds, regardless of the data_string size. However, on Google Chrome, the time it takes to load the 'overview' page is proportional to the size of the data string submitted in the hidden field.
For example, if I truncate the data string at various lengths, I receive different page load times:
Test Image 1:
5000 chars - 1.78 sec
50000 chars - 8.24 sec
73198 chars - 11.67 sec (Not truncated)
Test Image 2:
5000 chars - 1.92 sec
50000 chars - 8.79 sec
307466 chars - 42.24 sec (Not truncated)
My Question:
The delay is unacceptable (as most images will be at least 100k in size); does anyone know what's going on with Google Chrome?
I would like to reiterate that the server responds with the same speed regardless of browser; it is definitely a client-side, browser-specific issue with Google Chrome.
I would also appreciate alternative suggestions. I've spent some time attempting to fool the browser into thinking the data was a file upload (by changing the text input to a file input field and then manually trying to form the data and submit it via JavaScript), but I can't seem to get Django to recognize the falsified file (it errors out, believing that no file was uploaded).
Summary
Google Chrome seems to have a problem handling large amounts of data when said data is placed into an actual input field. I suspect it's an issue with Chrome attempting to clean up the memory used to display the data.
Details
I was able to achieve a workaround by doing away with the client-side form entirely and submitting the data via a JavaScript XMLHttpRequest (as I had touched on at the end of my question), then redirecting the user to the next page in the AJAX callback.
I could never get Django to recognize a manually formed FileField object (as multipart/form-data), but I was able to get it to accept a manually formed CharField string (which was my base64 encoded canvas data).
Because the data is never placed into an input field, Google Chrome responds without delay.
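Roughly, the workaround looks like this (a sketch; the URLs and parameter name are hypothetical):
// Send the base64 canvas data as a POST body instead of a form field.
var canvas = document.querySelector("canvas");
var xhr = new XMLHttpRequest();
xhr.open("POST", "/designs/save/");                        // hypothetical endpoint
xhr.setRequestHeader("Content-Type", "application/x-www-form-urlencoded");
xhr.onload = function () {
  window.location = "/designs/";  // redirect in the callback, as described above
};
xhr.send("image_data=" + encodeURIComponent(canvas.toDataURL()));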
I hope that helps anyone who may run across a similar issue.
I was also having the exact same problem and was searching for a solution.
In my case there was no such problem for the initial few runs of the page.
Then it suddenly started to lag, eating up a large amount of memory, which in turn made my whole system run very slowly.
I tried on another PC; as expected, there was no problem submitting the big SVG data for the first few runs, but later it also showed the same lagging problem.
After reading your post I am planning to use jQuery's AJAX for posting the data. I hope this will solve the issue.
I have some images in a folder, saved automatically from a web camera and named by the current date_time. Now I just want to load each image after some seconds (let's say 4 sec) whose name matches my server's current date_time.
Using JavaScript...
I can get server time using PHP
In simpler words: is there any jQuery plugin that loads images from a folder with respect to the image name, where the name is based on the current date_time?
Thanks
You can't know the exact server time from JavaScript (unless your server and your computer are the same machine). Getting the time from the server using any server language to synchronize client and server will not work reliably because of the server's response time. How about another idea: write a page on the server that returns a list of the last images (or the last image), query it with JavaScript, and show the last image(s).
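A sketch of that idea (the server page returning the newest file name is hypothetical):
// Every 4 seconds, ask the server for the newest image name and display it.
setInterval(function () {
  var xhr = new XMLHttpRequest();
  xhr.open("GET", "/latest-image.php");   // hypothetical page returning e.g. "2012-05-01_120301.jpg"
  xhr.onload = function () {
    document.getElementById("camera").src = "/images/" + xhr.responseText;
  };
  xhr.send();
}, 4000);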
I need to do as much as possible on the client side. In more detail, I would like to use JavaScript to code an interface (which displays information to the user and which accepts and processes responses from the user). I would like to use the web server just to take a data file from there and then to send a modified data file back. In this respect I would like to know if the following is possible in JavaScript:
1. Can JavaScript read the content of an external web page? In other words, on my local machine I run JavaScript which reads the content of a given web page.
2. Can JavaScript process values filled in an HTML form? In other words, I use HTML and JavaScript to generate an HTML form. The user is supposed to fill in the form and press a "Submit" button. Then the data should be sent to the original HTML file (not to a web server), and this data should be processed by JavaScript.
3. In the very end, JavaScript will generate a local data file, and I want to send this file to a PHP web server. Can I do it with JavaScript?
4. Can I initiate the execution of a local program from JavaScript? To be more specific, the local program is written in Python.
I will appreciate any comments and answers.
1. It could technically, but can't in reality due to the same-origin policy. This applies to both reading and writing external content. The best you can do is load an iframe with a different domain's page in it, but you can't access it programmatically. You can work around this in IE; see Andy E's answer.
2. Yes for the first part; not really for the second part. You can submit a form to an HTML page and read GET arguments using JavaScript, but it's very limited (recommended maximum data size around 1024 bytes). You should probably have all the intelligence on one page.
3. You can generate a file locally for the user to download using Downloadify. Generating a file and uploading it to a server won't be possible without user interaction. Generating data and sending it to a server as POST data should be possible, though (see the sketch after this list).
4. This is very, very difficult. Due to security restrictions, in most browsers it's mostly not possible without installing an extension or similar. Your best bet might be Internet Explorer's proprietary scripting languages (WScript, VBScript) in conjunction with the "security zones" model, but I doubt whether the execution of local files is possible even there nowadays.
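For point 3, sending the generated data to a PHP server as POST data could look like this (a sketch; the endpoint is hypothetical):
// Send JavaScript-generated data to the server without a file upload.
var modifiedData = { example: "value" };  // whatever your interface produced
var xhr = new XMLHttpRequest();
xhr.open("POST", "/save.php");            // hypothetical PHP endpoint
xhr.setRequestHeader("Content-Type", "application/x-www-form-urlencoded");
xhr.send("data=" + encodeURIComponent(JSON.stringify(modifiedData)));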
Using Internet Explorer with a local file, you can do some of what you're trying to do:
It's true that pages are limited by the same origin policy (see Pekka's link). But this can be worked around in IE using the WinHttpRequest COM interface.
As Pekka mentioned, the best you can manage is GET requests (using window.location.search). POST request variables are completely unobtainable.
You can use the COM interface for FileSystemObject to read & write local text files.
You can use the WScript.Shell interface's Exec method to execute a local program.
So just about everything you asked is attainable, if you're willing to use Internet Explorer. The COM interfaces will require explicit permission to run (a la the yellow alert bar that appears). You could also look at creating a Windows Desktop Gadget (Vista or Win 7) or a HTML Application (HTA) to achieve your goal.
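By way of illustration, the COM interfaces above are used from JScript like this (local file in IE only; the paths are hypothetical):
// Read a local text file via FileSystemObject.
var fso = new ActiveXObject("Scripting.FileSystemObject");
var file = fso.OpenTextFile("C:\\data\\input.txt", 1);  // 1 = ForReading
var text = file.ReadAll();
file.Close();
// Launch a local Python program via WScript.Shell.
var shell = new ActiveXObject("WScript.Shell");
shell.Exec("python C:\\scripts\\process.py");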
Failing all that, turn your computer into a real server using XAMPP and write your pages in PHP.
See, I get what you want to do.
The best thing is to do the following:
Choose a JavaScript library (e.g. jQuery, Dojo, YUI, etc.); I use jQuery. This will decrease some of your load.
Instead of saving form data in a local file, store it in local variables, process it, and send it to the server (for further processing, like adding to/updating the database, etc.) using XMLHttpRequest; when the web service returns data, process that data and update the DOM.
I am showing you a sample:
--this is dom
Name:<input type='text' id='name' />
<a href='javascript:void(0)' onClick='submit()'>Submit Form</a>
<br>
<div id='target'></div>
--this is js
function submit()
{
    var _name = $('#name').val(); // collect the text box's data
    // now validate it or do anything you want
    callWebservice(_name, _suc, _err);
    // the callWebservice fn above has to be created by you; it sends this data
    // to the server and does the XMLHttpRequest etc. for you
}
// called when data is successfully returned from the server
function _suc(data)
{
    // data = data from the server; in this case it might be "Hello user Name"
    // (name = what was filled in the input box)
    // update this data in the target div (manipulate the DOM with the new data)
    $('#target').html(data);
}
// called when an error occurs on the server
function _err()
{
}
// In reality most of the work is done using JSON. I have shown you the basic idea of how to use JS to manipulate the DOM, call services, and do the rest. This way we avoid page reloads and the new data is visible to the viewer.
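A possible callWebservice, using jQuery's $.ajax (the URL is hypothetical):
function callWebservice(name, onSuccess, onError)
{
    $.ajax({
        url: '/greet',          // hypothetical endpoint that returns "Hello user " + name
        type: 'POST',
        data: { name: name },
        success: onSuccess,     // receives the server's response text
        error: onError
    });
}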
I would answer saying there's a lot you can do, but then in the comment to the OP, you say "I would like to program a group game."
And so, my answer becomes: only do on the client side what you are able and willing to double-check on the server side. Never trust the client!
To your point that you do not want to do your job twice: if you are going to do things on the client side, you will have to do it twice, or else be subject to rampant cheating.
We had the same question when we started our project. In the end we moved everything we could to the JS side. Here's our stack:
The backend receives and sends JSON data exclusively. We use Erlang, but Python would be the same. It handles the authentication/security and the storage.
The frontend is in HTML+CSS for the visual elements and JS for the logic. A JS template engine converts the JSON into HTML. We've built PURE, but there are plenty of others available. MVC can be overkill on the browser side, but IMO using a template engine is the minimum separation you can do.
The response time is amazing. Once the page and the JS/CSS are loaded (fresh or from the cache), only the data crosses the network for each request.
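As a rough illustration of that flow (generic JS, not PURE's actual API; the endpoint is hypothetical):
// Fetch JSON from the backend and render it into HTML on the client.
var xhr = new XMLHttpRequest();
xhr.open("GET", "/api/events");           // hypothetical JSON endpoint
xhr.onload = function () {
  var items = JSON.parse(xhr.responseText);
  document.getElementById("list").innerHTML = items
    .map(function (it) { return "<li>" + it.title + "</li>"; })
    .join("");
};
xhr.send();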