I have an MVC 2 application hosted on an IIS6 server. I have already done all the routing tweaks so that it browses correctly in that environment. The problem is that I have a dynamic partial-view feature, where a partial view is loaded each time an add button is clicked. Using JavaScript and a controller, I fetch the partial view and append it to a table each time.
JavaScript code:
<script type="text/javascript">
    $(function () {
        $("#btnAdd").click(function (e) {
            var itemIndex = $("#container input.iHidden").length;
            console.debug("itemIndex : " + itemIndex);
            e.preventDefault();
            var URL = "/WorkOrder/NewItem/" + itemIndex;
            $.get(URL, function (data) {
                $("#container").append(data);
            });
        });
    });
</script>
and the controller is
public ActionResult NewItem(int id)
{
    var interest = new ItemModel { index = id };
    return View("_NewItem", interest);
}
Quite simple, really. The funny thing is that it works in the local test environment, but as soon as I deploy it to production, the btnAdd handler does nothing. Using the Network tab of the browser's developer tools, I discovered that the request for the partial view returns a 404 error.
Do I have to tweak the routing tables further to make them recognize the routing scheme I am trying to implement?
Try using the Url.Action method instead of hard-coding the URI, and pass the data using the data parameter. A hard-coded path like "/WorkOrder/NewItem/" breaks when the application is deployed under a virtual directory on IIS6, because the request then misses the application path; that is the most likely cause of your 404. Url.Action generates the correct URL in both environments.
Example:
var URL = '<%= Url.Action("NewItem", "WorkOrder") %>';
$.get(URL,
    { id: itemIndex },
    function (data) {
        $("#container").append(data);
    });
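For completeness, a minimal sketch of the full click handler using the generated URL (assuming the same element ids and the default route from the question):
$("#btnAdd").click(function (e) {
    e.preventDefault();
    var itemIndex = $("#container input.iHidden").length;
    // Url.Action emits the application-relative path at render time,
    // so any IIS virtual directory prefix is included automatically.
    var url = '<%= Url.Action("NewItem", "WorkOrder") %>';
    $.get(url, { id: itemIndex }, function (data) {
        $("#container").append(data);
    });
});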
I'm working on a small project, mostly to gain some understanding, because I am a complete web noob, which may or may not be apparent in the following question(s).
My end goal is fetching a bunch of values from a database, putting them into a JSON object, and then using some jQuery/JavaScript to work with the data. I've been doing a ton of reading and been through tons of posts on here, but I can't find anything that relates to my problem.
I'm working in VS2010 C#. I'm bound by IE7 on the client side, but I can test with Firefox. I'm also working on localhost for now.
My JavaScript is as follows.
function plotMarkers() {
    var pathStart = new String(window.location);
    var newPath = pathStart.replace("Default.aspx", "getMarkers.aspx");
    alert(newPath);
    $("#statusDiv").html('JSON Data');
    $.getJSON(newPath, function (data) {
        alert("Loading JSON");
        $.each(data, function (index, elem) {
            var myLatlng = new google.maps.LatLng(elem.lat, elem.lng);
            var marker = new google.maps.Marker({
                position: myLatlng,
                title: elem.title
            });
            marker.setMap(map);
        });
    }, function () { alert("loaded") });
}
I don't get any JavaScript errors in Firefox, but I do in IE7. The alert("Loading JSON") never fires anywhere; I've never seen it.
My getMarkers.aspx code
public partial class getMarkers : System.Web.UI.Page {
    protected void Page_Load(object sender, EventArgs e) {
        JavaScriptSerializer jsonSer = new JavaScriptSerializer();
        List<marker> markerList = new List<marker>();
        markerList.Add(new marker(-34.397, 150.644, "Test"));
        markerList.Add(new marker(-50, 150.644, "Test2"));

        string sJSON = jsonSer.Serialize(markerList);
        Response.ContentType = "application/json";
        Response.Write(sJSON);
    }
}
I can click the link in "statusDiv" and it takes me to the aspx page. I get an error in IE7, but it works fine in Firefox. The JSON returned to me on those pages is correct: I copied and pasted the response into another function, and the plotter put the two pins on my map.
The code does not seem to enter the $.getJSON callback at all.
There may be a problem with IE7 (or a problem in general) with how my getMarkers.aspx is set up. I Googled some tutorials and this is where I ended up. I might be completely wrong in my approach, or I missed something I was supposed to do. I can't seem to find that specific tutorial; it must have been closed by accident amidst my furious two-day-and-counting Google adventure.
In Firefox, getMarkers.aspx displays the JSON as the first line, followed by the HTML, all as plain text in the browser.
IE7 displays an XML error and tells me the JSON string isn't allowed at the top level of the document. I did read some articles about IE7 always trying to parse XML, with an elaborate workaround, which I can't apply because I don't know how many clients will be using it. Is there a better way to do what I'm doing?
If anyone could point me in the direction I need to go, I would greatly appreciate it. Hopefully my post isn't too convoluted.
You probably want a Response.End() at the end of your Page_Load method, so that the rest of the web page isn't rendered after your JSON.
It looks to me, for starters, like you need the json2 library to handle the JSON-parsing issues in IE7.
As for firing your function: I can't see anywhere that plotMarkers() is actually called; it's only defined. A sketch of both fixes follows.
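A minimal sketch (the Scripts/json2.js path is just an example location; json2.js only defines JSON.parse and JSON.stringify when the browser lacks native support, so it is safe to include for IE7):
<script type="text/javascript" src="Scripts/json2.js"></script>
<script type="text/javascript">
    // Defining plotMarkers is not enough; it has to be invoked once the DOM is ready.
    $(document).ready(function () {
        plotMarkers();
    });
</script>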
Are you trying to display markers on a map in an existing page when the #statusDiv link is clicked?
If so, you're doing it wrong: don't put a link in #statusDiv; just make sure the getJSON call happens when you click on it. Try this:
function plotMarkers() {
    var pathStart = new String(window.location);
    var newPath = pathStart.replace("Default.aspx", "getMarkers.aspx");
    alert(newPath);
    $("#statusDiv").text('JSON Data');
    $("#statusDiv").on('click', function () {
        $.getJSON(newPath, function (data) {
            alert("Loading JSON");
            $.each(data, function (index, elem) {
                var myLatlng = new google.maps.LatLng(elem.lat, elem.lng);
                var marker = new google.maps.Marker({
                    position: myLatlng,
                    title: elem.title
                });
                marker.setMap(map);
            });
        });
    });
}
(Note that .on() was added in jQuery 1.7; on older versions, use .click() or .bind('click', ...) instead.)
I'm trying to write a KDE4 plasmoid in JavaScript, but with no success so far.
I need to get some data via HTTP and display it in a Label. That part works well, but I also need a regular refresh (once every 10 seconds), and that is not working.
My code:
inLabel = new Label();
var timer = new QTimer();
var job = 0;
var fileContent = "";

function onData(job, data) {
    if (data.length > 0) {
        var content = new String(data.valueOf());
        fileContent += content;
    }
}

function onFinished(job) {
    inLabel.text = fileContent;
}

plasmoid.sizeChanged = function () {
    plasmoid.update();
}

timer.timeout.connect(getData);
timer.singleShot = false;
getData();
timer.start(10000);

function getData() {
    fileContent = "";
    job = plasmoid.getUrl("http://192.168.0.10/script.cgi");
    job.data.connect(onData);
    job.finished.connect(onFinished);
    plasmoid.update();
}
It fetches the script once but does not refresh it after 10 seconds. Where is my mistake?
It is working just fine here, at least (running a recent build from git master); getData() is being called as expected. Can you see any errors in the console?
EDIT: The problem was that getUrl() explicitly sets NoReload for KIO::get(), which causes it to load data from the cache instead of forcing a reload from the server. The solution was to add a query parameter to the URL in order to force a reload.
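For reference, a minimal sketch of that workaround, assuming the CGI script ignores unknown query parameters:
function getData() {
    fileContent = "";
    // A unique query string defeats KIO's cache and forces a fresh GET.
    job = plasmoid.getUrl("http://192.168.0.10/script.cgi?nocache=" + new Date().getTime());
    job.data.connect(onData);
    job.finished.connect(onFinished);
    plasmoid.update();
}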
I've spent days on this and hit it from every angle I can think of. I'm working on a simple Windows 7 gadget. This script pulls JSON data from a remote web server and puts it on the page. I'm using jQuery 1.6.2 for the $.getJSON call. The script consumes more memory on each loop.
var count = 1;

$(document).ready(function () {
    updateView();
});

function updateView() {
    $("#junk").html(count);
    count++;
    $.getJSON(URL + "&callback=?", populateView);
    setTimeout(updateView, 1000);
}

function populateView(status) {
    $("#debug").html(status.queue.mbleft + " MB Remaining<br>" + status.queue.mb + " MB Total");
}
Any help would be greatly appreciated... Thank you!
EDIT: Added a JSON data sample.
?({"queue":{"active_lang":"en","paused":true,"session":"39ad74939e89e6408f98998adfbae1e2","restart_req":false,"power_options":true,"slots":[{"status":"Queued","index":0,"eta":"unknown","missing":0,"avg_age":"2d","script":"None","msgid":"","verbosity":"","mb":"8949.88","sizeleft":"976 MB","filename":"TestFile#1","priority":"Normal","cat":"*","mbleft":"975.75","timeleft":"0:00:00","percentage":"89","nzo_id":"-n3c6z","unpackopts":"3","size":"8.7 GB"}],"speed":"0 ","helpuri":"","size":"8.7 GB","uptime":"2d","refresh_rate":"","limit":0,"isverbose":false,"start":0,"version":"0.6.5","new_rel_url":"","diskspacetotal2":"931.51","color_scheme":"gold","diskspacetotal1":"931.51","nt":true,"status":"Paused","last_warning":"","have_warnings":"0","cache_art":"0","sizeleft":"976 MB","finishaction":null,"paused_all":false,"cache_size":"0 B","finish":0,"new_release":"","pause_int":"0","mbleft":"975.75","diskspace1":"668.52","scripts":[],"categories":["*"],"darwin":false,"timeleft":"0:00:00","mb":"8949.88","noofslots":1,"nbDetails":false,"eta":"unknown","quota":"","loadavg":"","cache_max":"0","kbpersec":"0.00","speedlimit":"","webdir":"","queue_details":"0","diskspace2":"668.52"}})
EDIT 2: Stripped the code down to this and it still leaks. I think that eliminates DOM traversal as a contributor.
$(document).ready(function () {
    setInterval(updateView, 1000);
});

function updateView() {
    $.getJSON(URL + "&callback=?", populateView);
}

function populateView(status) {
}
EDIT 3: It's not jQuery. I removed jQuery and did it with plain JS. It still leaks.
function init() {
    setInterval(updateView, 1000);
}

function updateView() {
    var xhr = new XMLHttpRequest();
    xhr.open("GET", URL, false);
    xhr.setRequestHeader("If-Modified-Since", "0");
    xhr.send('');
}
So... if it's not jQuery, and it's not just IE (it happens in Chrome too), what the heck?! Ideas?
Thank you!
Edit 2:
If it's actually Task Manager showing the leak here, then I think the next step is to investigate IE, as I believe IE is the engine used to host Windows gadgets.
If you can recreate your script in a little HTML file, you can run this tool and check whether it's IE that's doing it:
http://blogs.msdn.com/b/gpde/archive/2009/08/03/javascript-memory-leak-detector-v2.aspx
Also, are you running IE8 or 9?
Edit:
Based on the JSON string in the OP, the problem is misleading here: the bit of JavaScript posted is working perfectly fine.
The server producing the JSON is the one showing a difference in memory usage; I would investigate the website/endpoint that's creating that JSON and see what the issue is.
Just had a thought: $.getJSON is just a shorthand function for jQuery's $.ajax call.
I wonder if it makes a difference if you change your code to use $.ajax but specifically add the cache mechanism to it:
$.ajax({
    url: URL + "&callback=?",
    dataType: 'json',
    cache: false,
    success: populateView
});
That might stop it from trying to hold responses in memory (jQuery implements cache: false by appending a throwaway _=timestamp parameter to the URL). And depending on your browser, it might be showing more memory simply because your garbage hasn't been collected yet, so to speak.
I have a feeling that the setTimeout call within updateView is causing this behaviour. To test this, you can modify your code to:
$(document).ready(function () {
    setInterval(updateView, 1000);
});

function updateView() {
    $("#junk").html(count);
    count++;
    $.getJSON(URL + "&callback=?", populateView);
}

function populateView(status) {
    $("#debug").html(status.queue.mbleft + " MB Remaining<br>" + status.queue.mb + " MB Total");
}
EDIT: The setInterval function will execute the passed-in function over and over, every x milliseconds. Here's a link to the docs.
EDIT 2:
Another performance drain (although it might not be critical to the issue) is that you are traversing the DOM every second to find the $('#debug') element. You could store it once and pass it in:
$(document).ready(function () {
    var debug = $('#debug');
    var junk = $('#junk');
    setInterval(function () { updateView(debug, junk); }, 1000);
});

function updateView(debug, junk) {
    junk.html(count);
    count++;
    $.getJSON(URL + "&callback=?", function (status) { populateView(status, debug); });
}

function populateView(status, debug) {
    debug.html(status.queue.mbleft + " MB Remaining<br>" + status.queue.mb + " MB Total");
}
Edit 3: I have changed the code above because I forgot to take the server's response into account. Assuming that queue is a property of the returned JSON, the code should be as above.
Edit 4: This is a very interesting issue. Another approach, then. Let's assume there are still some client-side scripts clogging the memory. What could they be? As far as I understand, the only two things left are setInterval and $.getJSON. The $.getJSON function is a simple ajax-request wrapper which fires a request and waits for the response from the server. The setInterval function is a bit more peculiar, because it sets up timers, fires functions, etc.
I think if you manage to mimic this on your server, or even just refresh this webpage in your browser every second/every 5 seconds, you will be able to see whether it is the client or the server processing the requests that is at fault.
Found this thread while trying to find the underlying reason for this problem, because I also had a similar problem recently, although my memory would increase by about 1 MB per minute... I pretty much isolated it to JSON parsing. Run the ajax command with dataType: 'text' and you should see that the memory gets cleaned up.
I found a library, json_parse.js, which recursively parses the JSON data using the JS engine (not eval). I manually parse the text data to JSON in the success callback, and this works well.
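Roughly, that approach might look like this (a sketch, assuming json_parse.js exposes a global json_parse function, as in Douglas Crockford's reference implementation):
$.ajax({
    url: URL,
    dataType: 'text',   // hand back the raw string; skip jQuery's own JSON parsing
    cache: false,
    success: function (text) {
        var status = json_parse(text);  // parse manually with json_parse.js
        populateView(status);
    }
});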
I'm currently programming in JSP and JavaScript (I am by no means an expert in either). What I want is for a JavaScript function to be called repeatedly, with one of its variables queried from the database on each call (the date that the page was last modified). If this variable is greater than the time the page was loaded, I want the page to refresh.
What I have so far:
...
<body onload="Javascript:refreshMethod()">
<script type="text/JavaScript">
<!--
    function refreshMethod()
    {
        var interval = setInterval("timedRefresh()", 10000);
    }

    function timedRefresh() {
        var currenttime = '<%=currentTime%>';
        var feedlastmodified = '<%=EventManager.getFeedLastModified(eventID)%>';
        var currenttimeint = parseInt(currenttime);
        var feedlastmodifiedint = parseInt(feedlastmodified);

        if (feedlastmodifiedint > currenttimeint)
        {
            alert(feedlastmodifiedint);
            setTimeout("location.reload(true);", timeoutPeriod);
        }
        if (feedlastmodifiedint < currenttimeint)
        {
            alert(feedlastmodifiedint + " : " + currenttimeint);
        }
    }
// -->
</script>
The problem is that every time timedRefresh runs, feedlastmodifiedint never changes (even if the underlying value has been changed).
Any help would be appreciated,
Thanks.
The JSP code within the <% ... %> tags runs only once, on the server side, when the page is loaded. If you look at the source of the page in the browser, you will find that these values have already been placed into the JavaScript code, and thus they will not change on each timer interval.
To update the data as you are expecting, you can use AJAX; a bare-bones sketch follows. You can also find plenty of tutorials online.
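For example (lastModified.jsp is a hypothetical endpoint that returns the feed's last-modified timestamp as plain text; pageLoadTime holds the value rendered into the page at load time):
var pageLoadTime = parseInt('<%=currentTime%>', 10);

function pollLastModified() {
    var xhr = new XMLHttpRequest();
    // The throwaway query parameter keeps the browser from caching the response.
    xhr.open("GET", "lastModified.jsp?t=" + new Date().getTime(), true);
    xhr.onreadystatechange = function () {
        if (xhr.readyState === 4 && xhr.status === 200) {
            // Compare the fresh server value against the value rendered at page load.
            if (parseInt(xhr.responseText, 10) > pageLoadTime) {
                location.reload(true);
            }
        }
    };
    xhr.send(null);
}

setInterval(pollLastModified, 10000);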
JSP and JavaScript don't run in sync, as you seem to expect from the code. JSP runs on the webserver and produces a bunch of characters that make up HTML/CSS/JS; the webserver sends that as an HTTP response to the web browser, in response to an HTTP request initiated by the web browser. Finally, the HTML/CSS/JS runs in the web browser.
If you right-click the page in the web browser and choose View Source, you'll probably understand what I mean. There's not a single line of Java/JSP code; it has already done its job of generating the HTML/CSS/JS. The only communication channel between Java/JSP and JavaScript is HTTP.
You need to move this job to a servlet on the server side and let JS invoke it asynchronously ("in the background"). This is also known as "Ajax". Here's a kickoff example with a little help from jQuery.
<script src="http://code.jquery.com/jquery-latest.min.js"></script>
<script>
    $(document).ready(function () {
        var refreshInterval = setInterval(function () {
            $.getJSON('refreshServlet', function (refresh) {
                if (refresh) {
                    clearInterval(refreshInterval);
                    location.reload(true);
                }
            });
        }, 10000);
    });
</script>
Where the doGet() method of the servlet, which is mapped on an url-pattern of /refreshServlet, would roughly look like this:
response.setContentType("application/json");

if (EventManager.getFeedLastModified(eventID) > currentTime) {
    response.getWriter().write("true");
} else {
    response.getWriter().write("false");
}
See also:
Communication between Java/JSP/JSF and JavaScript