Printing from JavaScript without header and footer: state of the art?

When I print content using JavaScript, the browser automatically adds a header and footer (URL, date, page number). Currently there seems to be no way to suppress this from the web-app side.
CSS3 might eventually be a solution (e.g. with the @page rule and margin boxes such as @top-left), but it doesn't seem to work here yet (Windows Vista, Chrome 17.0.942.0 / Firefox 9.0). When is it supposed to arrive in browsers?
Another solution might come from Chrome itself: in the version above, the print dialog is no longer the modal system dialog but is rendered within the page (and it includes a checkbox to disable the header and footer). Now that Chrome has reworked its print dialog, might it also provide an API to control printing from JavaScript?
Are there other solutions on the horizon? It can't be the final state of affairs that printing from the browser with full control requires PDF or other plugins.

Do this:
@page {
margin: 0;
}
Done!
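For a slightly fuller sketch (how far @page actually suppresses the browser's header and footer varies by browser and version; .print-area is a hypothetical wrapper class for your printable content):

/* Zero page margin: with no margin area left, most browsers have
   nowhere to print their default header and footer. */
@page {
    margin: 0;
}

/* Re-introduce spacing on the content itself so text does not touch
   the paper edge. */
@media print {
    .print-area {
        margin: 15mm;
    }
}

The trade-off is that you then manage all page spacing in your own stylesheet rather than relying on the printer margins.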

Currently, JavaScript is very limited in accessing resources "outside the browser", like hardware and the file system, for security reasons. Judging from that trend, I doubt programmatic control over how prints come out is in JavaScript's future. I say this because having those headers and footers (however ugly they are) should ultimately remain the user's decision.
Even with CSS3, you are still talking about reaching content outside of the HTML document itself. Those headers and footers are set by internal browser functions. However, Chrome does provide an easy UI for getting rid of them when printing.
That said, especially with Chrome, there is a lot of power in extensions, especially if you use NPAPI plugins (though that just poses another security risk). This route is very technical, but could probably be "another solution."

Related

input type=button with and without form tag

I have created lots of buttons for a large number of pages (usually 5-10 in a row at the bottom of each page, inside a table cell) using input type="button" name="..." value="..." onclick="some JavaScript event handler", etc., basically to link to other pages of the same group. All these pages are ultimately linked from an iframe on a single page. The buttons work fine offline on my PC, at least. But now I've suddenly realized that I haven't used any form tag for these buttons.
So my question is: is this form tag really necessary? Will there be any problem after I upload? I would prefer not to add the form tag to so many pages now if it's not really necessary, because that would be a real drag. But I don't want to suffer afterwards either.
No, it is not necessary as long as you are not doing any GET/POST and don't need to group form elements together. The buttons should work completely fine without a form tag.
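For illustration, a minimal sketch of the pattern from the question: buttons wired to an onclick handler with no form element anywhere (goToPage and the page names are made up for the example):

<!-- No <form> wrapper: the buttons only run JavaScript, so none is needed. -->
<td>
    <input type="button" name="prev" value="Previous" onclick="goToPage('page4.html')">
    <input type="button" name="next" value="Next" onclick="goToPage('page6.html')">
</td>

<script type="text/javascript">
    // Hypothetical handler: navigates the current (i)frame to another page.
    function goToPage(url) {
        window.location.href = url;
    }
</script>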
There are two issues to be concerned with:
Is it valid HTML?
It turns out that it is valid HTML (see Is <input> well formed without a <form>?), so you are on the safe side here.
Will all common browsers accept it?
After googling around I haven't seen any reports of problems with this use of input tags. That suggests that all common browsers accept this valid HTML (as they should). When developing any website that is accessible to the general public, I would always do a manual cross-browser check to discover any abnormalities certain browsers may have with my site.
The problem is that you most likely won't be able to tell from your server logs whether certain browsers have a problem with your HTML. It may simply not work on IE 6, and you would never find out unless a disgruntled customer calls up and informs you.
If this is a generally accessible website, get some stats on the most commonly used browsers, decide which ones you want to support, and verify that the website behaves properly on those browsers. This is a pain in the ass, but there is no way I know of to get around it. Browsers may not react to valid HTML properly.
As a rule of thumb, Firefox, Chrome, and Safari usually behave well, and because of auto-updates most people will have a very recent version installed. If the latest version of a browser works, I wouldn't be too concerned that users with some older version will have trouble.
The real test is always Internet Explorer. While versions 8 and 9 are pretty standards-compliant, IE 7 certainly needs checking. IE 6 is the worst offender for unusual behavior. It was introduced in 2001, yet 6% of users still run it today! Most of that comes from pirated copies of Windows XP in China, but it is also used quite a lot in corporate networks, where the OS and browsers are centrally deployed and administrators have not managed to move on since the early 2000s.
In conclusion: your code is unusual but OK. Test it manually on the browsers you expect your visitors to use.

how to set browser compatibility in netbeans jsp application

I'm making a JSP web-based application. Some of the buttons and labels are not visible in Internet Explorer but look fine in Google Chrome. How do I set up compatibility so that users see the application render properly in all browsers?
Browser compatibility does not depend on the Java/JSP code, but on the HTML/CSS/JS code it generates. You, as the web developer, should usually have full control over this. If you use an HTML strict doctype and write HTML according to the W3C standards (i.e. the page at least passes the W3C validator), and you use jQuery for writing JavaScript functions, then you have basically nothing to worry about.
That leaves CSS, which can be a pain when you use a wrong HTML doctype that pushes MSIE into the so-called "quirks mode" (which reveals the MSIE box-model bug). So you'd want to get at least the doctype straight first (to start with, use the HTML5 <!DOCTYPE html>). This should solve most MSIE CSS issues, and you can then fix the remaining CSS issues individually. More often than not they concern IE6/7-specific CSS bugs which boil down to the hasLayout bug. It's impossible to cover all those bugs in detail in a single answer. For that, you'd better ask a separate, specific question here whenever you get stuck fixing an individual CSS issue.
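As a minimal illustration of "get the doctype straight first", a bare HTML5 skeleton that keeps MSIE out of quirks mode (the title and stylesheet name are placeholders):

<!DOCTYPE html>
<html>
    <head>
        <meta charset="utf-8">
        <title>My JSP page</title>
        <!-- style.css is a placeholder; your cross-browser CSS goes here -->
        <link rel="stylesheet" href="style.css">
    </head>
    <body>
        <!-- generated JSP output goes here -->
    </body>
</html>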

What happens to my web application if JavaScript is disabled?

I'm learning jQuery and am about to write some pages that use that library intensively. I just learned that some users disable JavaScript in their browser (I didn't even know that was possible and/or necessary).
Now, here's my question: what happens to my web application if a user disables JavaScript? For instance, I'd like to display some screens using Ajax and calls such as insertBefore to bring to life a DIV that will display the result.
So, if JavaScript is disabled, I wonder what is going to happen to all this work that relies on JavaScript?
I'm kind of lost.
Thanks for helping
You may want to start by reading up on Progressive Enhancement and Unobtrusive JavaScript.
I would also suggest investigating how popular rich web applications like GMail, Google Maps and others handle these situations.
I just learned that some users disable JavaScript in their browser
I do. The NoScript plugin for Firefox does the trick.
So, if JavaScript is disabled, I wonder what is going to happen to all this work that relies on JavaScript?
It won't be functional.
Good practice suggests designing a site that does not rely on JavaScript for its major functionality. At the very least, accessing its content (in read mode) should be possible. JavaScript should only add interface enhancements like Ajax techniques, etc. The fallback version should always work.
I feel really sad when I see a site that is completely broken without JavaScript. Why can't people use CSS to put elements in their proper places? Why do they try to align elements with JavaScript even when there are no dynamics involved?
The same goes for Flash sites. Once in a while I land on a "web design agency" site which makes snide comments about me not allowing JavaScript. When I do allow it, all I see is a basic, primitive site with a few menus, and that's it. What was the point of using Flash when the work is so primitive it could be done with raw HTML and CSS in an hour? To me it's a sign of unprofessional work.
Everything that's done in JavaScript won't work. Some users disable it for security reasons; NoScript is a prime example. You can try it yourself by removing the scripts from your page or by installing the NoScript plugin for Firefox.
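If you want to give such users a hint instead of a silently broken page, the classic fallback is a <noscript> block (plain.html is just a made-up example of a script-free version):

<noscript>
    <p>This page works best with JavaScript enabled.
       You can still use the <a href="plain.html">basic version</a>.</p>
</noscript>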
As a rule of thumb (see the sketch after this list):
Make the website work with only semantic HTML
Add the CSS
Add the JS
But the website should be (almost) fully functional at stage 1.
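A tiny sketch of that rule of thumb: the link below works as a normal page load with JavaScript disabled and is only upgraded to an Ajax update when JavaScript is available (jQuery and the file names are assumptions for the example):

<!-- Stage 1: plain semantic HTML that works on its own -->
<a id="news-link" href="news.html">Latest news</a>
<div id="news"></div>

<script src="jquery.js"></script>
<script>
    // Stage 3: unobtrusive enhancement. If this script never runs,
    // the link above still takes the user to news.html.
    $(function () {
        $('#news-link').click(function (event) {
            event.preventDefault();
            $('#news').load('news.html'); // fetch the same content via Ajax
        });
    });
</script>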
If you disable JavaScript in Safari, things like Lexulous on Facebook won't work properly; for example, dragging letters with the mouse no longer works.

HTML validation and loading times

Does having (approx. 100) HTML validation errors affect my page loading speed? Currently the errors on my pages don't break the page in ANY browser, but I'd spend time clearing them anyhow if it would improve my page loading speed.
If not on desktops, how about mobile devices like iPhone or Android? (For example, the N1 and Droid load pages much slower than the iPhone although they all use the WebKit engine.)
Edit: My focus here is speed optimization, not cross-browser compatibility (which is already achieved). Google and other big players seem to use invalid HTML; is that for speed, for compatibility, or both?
Edit #2: I'm not in quirks mode, i.e. I use the XHTML Strict doctype, my source looks clean and is mostly valid, but 100% valid HTML usually requires a design (or some other kind of) sacrifice.
Thanks
It doesn't affect -loading- speed. Bad data is transferred over the wires just as fast as good data.
It does affect rendering speed, though (...and in some cases... positively! MSIE tends to be abysmally slow in standards mode). In most cases, however, render speed will be somewhat slower due to quirks mode, which is less efficient and more paranoid: instead of just executing your data like a well-written program, the browser tries its best to fish some meaningful content out of what is essentially tag soup.
Some validation errors, like a missing ALT attribute or no / at the end of self-closing tags, won't affect rendering at all, but others, like a missing closing tag or antiquated, obsolete attributes, may impact performance seriously.
It might affect loading speed, or it might not. It depends on the kind of errors you're getting.
I'd say that in most cases it's likely to be slower, because the browser has to handle those errors. For instance, if you forget to close a div tag, some browsers will close it for you. That takes processing time and increases the loading time.
That said, I don't think the time delta between no errors and 100 errors would be more than minimal. But if you have that many errors, you should consider fixing your code :)
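For instance, a fragment like this one (made up for illustration) forces the parser into error recovery, because it has to guess where the unclosed div ends before it can build the DOM:

<!-- Invalid: the first <div> is never closed, so the parser has to
     decide whether the footer belongs inside or outside of it. -->
<div class="content">
    <p>Some text
<div class="footer">Footer</div>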
Probably yes, and here's why.
If your code is valid for the W3C doctype you are using, the browser doesn't have to put in extra effort to try and fix your code. That recovery work is closely related to quirks mode, and it stands to reason that if your code validates, the browser won't have to try to piece the website back together.
Remember that it's always beneficial to make your code validate, if only to ensure a consistent design across the popular browsers. Finally, you'll probably find that once you fix the first few errors, your list of 100 errors shrinks drastically.
In theory, yes, it will decrease page load times, because the browser has to do less work handling errors and so on.
However, it does depend on the nature of the validation errors. If you're improperly nesting tags (which may actually be valid in HTML4), the browser has to do a little more work to figure out where elements start and end. And this is exactly the kind of thing that can cause cross-browser problems.
If you're simply using unofficial attributes (say, the target attribute on links), then support for them is either built into the browser or not. If the browser understands the attribute it will do something with it; otherwise it will ignore it.
One thing that will ramp up your validation error count is using <br> under XHTML or <br /> under HTML. Neither should increase loading times (although <br /> takes a fraction longer to download).

iframe shimming or ie6 (and below) select z-index bug

Um, I'm not sure if anyone else has encountered this problem.
A brief description: in IE6, <select> elements get displayed over any other item, even divs. Meaning, if you have a fancy JavaScript effect that displays a div which is supposed to be on top of everything (e.g. Lightbox, Multibox, etc.) on click of a certain element, and that div overlaps a <select>, your div gets displayed as if it were under the <select> (in this case even a maximum z-index doesn't help).
I've tried googling and found the iframe shim solution,
but I was hoping for some cleaner alternatives.
Or better yet, has anyone found a better solution?
The iframe method uses around 130 MB of RAM, which might slow down poorer machines.
You don't have to hide every select using a loop. All you need is a CSS rule like:
* html .hideSelects select { visibility: hidden; }
And the following JavaScript:
// hide:
document.body.className += ' hideSelects';
// show:
document.body.className = document.body.className.replace(' hideSelects', '');
(Or, you can use your favourite addClass / removeClass implementation).
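For example, with jQuery (assuming it is already on the page), the same toggle becomes:

// Hide the <select> elements (via the CSS rule above):
$(document.body).addClass('hideSelects');

// Show them again when the overlay closes:
$(document.body).removeClass('hideSelects');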
There is a jQuery plugin called bgiframe that makes the iframe method quite easy to implement.
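Roughly, usage looks like this (a sketch; #myOverlay stands in for whatever element needs to sit above the <select>, and the plugin file name may differ):

<script src="jquery.js"></script>
<script src="jquery.bgiframe.js"></script>
<script>
    // bgiframe inserts a transparent iframe shim behind the element in IE6
    // and does nothing in browsers that don't need it.
    $('#myOverlay').bgiframe();
</script>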
Personally, as a web developer, I'm at the point where I no longer care about the user experience in IE6. I'll make it render as close to "correct" as possible and make sure it's functional, but as far as speed goes, too bad. They can upgrade. IE7 (though still quite slow compared to every other browser) has been out for two years (almost to the day!). IE8 is going to be out shortly. Firefox is available for every platform. Safari is also an option (and super fast). Opera is available for most platforms.
IE6 was released over 7 years ago. IMHO, there is no reason to still be using it, other than lazy users and incompetent IT departments (or if you're a web developer).
In case anyone is interested, here's some IE shimming code:
* html .shimmed {
    _azimuth: expression(
        this.shimmed = this.shimmed || 'shimmed:'+this.insertAdjacentHTML('beforeBegin','<iframe style="filter:progid:DXImageTransform.Microsoft.Alpha(style=0,opacity=0);position:absolute;top:0px;left:0px;width:100%;height:100%" frameBorder=0 scrolling=no src="javascript:false;document.write('+"''"+');"></iframe>'),
        'inherit');
}
ref: this gist by subtleGradient and this post by Zach Leatherman
Prior to IE7 the drop-down list was a "windowed" control, meaning it was rendered directly by Windows rather than synthesized by the browser. As such, it couldn't support z-indexing against other synthesized controls.
In order to appear over a DDL, you must use another windowed control, like an IFRAME. You can also use a little-known IE-only feature called window.createPopup(), which essentially creates a chromeless popup. It has limitations, like unstoppable click-out dismissal, but those are actually kind of helpful if you're building a hover menu system.
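A rough sketch of the createPopup() technique (IE-only and long obsolete; shown purely to illustrate the idea):

// window.createPopup() exists only in old Internet Explorer.
var popup = window.createPopup();
var popupBody = popup.document.body;
popupBody.style.border = '1px solid black';
popupBody.style.backgroundColor = '#fff';
popupBody.innerHTML = '<div style="padding:4px">I appear above windowed controls like the select box</div>';

// Show a 200x50 popup at (100, 100), positioned relative to the document body.
popup.show(100, 100, 200, 50, document.body);

// It dismisses itself when the user clicks elsewhere; popup.hide() also works.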
The simplest and most elegant solution to this annoying IE bug that I found is the jQuery bgiframe plugin: http://docs.jquery.com/Plugins/bgiframe
I reached that conclusion after trying for two days to make it work with WebSphere Portal / portal applications, where everything is dynamic, including the fly-over menu.
There's also the ActiveX method, which I'm starting to explore. It requires writing conditional code that uses an ActiveX control instead of a select box for IE6. There's a demo script showing the technique, which is discussed in more detail here.
Update: it appears that MS Office is required for the ActiveX control to be present on the user's machine. In theory it might be possible to bundle it somewhere, somehow, but that gets a lot messier.
I know many people have suggested their own tips, but in my case I simply hide the selects using jQuery, like below.
$(':date').dateinput({
    format: 'dd/mm/yyyy',
    onBeforeShow: function(event) {
        $('select').hide();
    },
    onHide: function(event) {
        $('select').show();
    }
});
