I have a project that requires delivery both on a DVD and over the web. I have been using Flash to drive a menu system and JavaScript to load pages or trigger other actions on the web. However, when I move it to a DVD I receive Security Error 2060: the SWF is unable to communicate with the HTML page it is loaded into, so none of the JavaScript runs. I am using ExternalInterface calls and jQuery on the HTML page.
Searching online, I have made sure that Flash is set to "Allow local files only" when publishing, and on the HTML page I'm using swfobject with an allowscriptaccess param of "always"; looking at the generated code on the pages, the allowscriptaccess param is there.
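For reference, the embed looks roughly like this (a minimal sketch; the SWF name, element id, and sizes are placeholders):

swfobject.embedSWF(
    "menu.swf",        // placeholder SWF
    "flashContent",    // id of the placeholder div to replace
    "800", "600",      // dimensions
    "9.0.0",           // required player version
    false,             // no express-install SWF
    {},                // flashvars
    { allowscriptaccess: "always" },  // params that end up in the generated markup
    {}
);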
Is there some security setting I can program in that will let my Flash application function the same from a DVD as it does from the web and communicate with JavaScript? If I need to compile two different SWFs, that would be okay.
I suspect you are having a local sandbox problem. Have you gone through the information at http://www.adobe.com/devnet/flashplayer/security.html?
If you can do PC-only, then investigate Server2Go. This is a standalone WAMP stack that works well from a CD/DVD. Your page will then run in the internet zone, and you should not run into the same security problems.
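For completeness: the local sandbox can also be relaxed per machine by dropping a plain-text .cfg file into the Flash Player trust directory, e.g. %APPDATA%\Macromedia\Flash Player\#Security\FlashPlayerTrust\myapp.cfg on Windows (the file name is arbitrary). Each line of the file is simply a path to trust, so a disc running as drive D: would be trusted with a one-line file:

D:\

The catch is that the file has to exist on each viewer's machine before playback, so this helps for kiosks and known PCs more than for discs you hand out.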
I have successfully implemented the smart card reader in my project using plain HTML, and everything works perfectly, 100%, but when I transfer all the files to my MVC project and run it (localhost) I am unable to execute the Initialize function of the card reader; it fails with "Access is denied". In plain HTML, all the functions work without the access-denied error; in the MVC project the same call fails.
My object:
<!-- the card reader ActiveX control; the CLASSID identifies the installed reader component -->
<OBJECT id="EIDAWebComponent" style="border:solid 1px gray"
CLASSID="CLSID:A4B3BB86-4A99-3BAE-B211-DB93E8BA008B"
width="130" height="154"></OBJECT>
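The control is then scripted from JavaScript along these lines (a sketch; Initialize is the reader method named above, and the error handling is illustrative):

var reader = document.getElementById("EIDAWebComponent");
try {
    // this is the call that fails with "Access is denied" under localhost
    reader.Initialize();
} catch (e) {
    alert("Card reader error: " + e.message);
}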
Update:
I created a new ASP.NET Web Application (Empty template), then copied over all the JS and HTML files and set the HTML page as the start page. The problem might be in my localhost: when I browse the file in IE via the path C:\Public Data ActiveX\PublicDataActiveX.html the ActiveX control works, but via http://localhost:28679/PublicDataActiveX.html it does not.
The problem might simply be that you need to enable HTTPS for the ActiveX control to work.
If your stack makes this hard, you can put a proxy in front of it, like nginx:
http://cnedelcu.blogspot.com.co/2014/10/https-with-nginx-setting-up-ssl.html
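A minimal sketch of such a proxy, assuming the MVC app listens on port 28679 as in the question (certificate paths are placeholders):

server {
    listen 443 ssl;
    server_name localhost;
    ssl_certificate     /etc/nginx/ssl/server.crt;   # placeholder paths
    ssl_certificate_key /etc/nginx/ssl/server.key;
    location / {
        proxy_pass http://localhost:28679;           # the MVC app
    }
}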
You only shared a small portion of code and some screenshots, so it is hard to help this way, but below are some points you may have to consider:
When you say it is working in pure HTML, what does that mean exactly? How do you browse it: double-click an HTML file and open it in the browser as a local file, with a path like "C:\Folder\File.html"?
Or do you browse it under a domain, maybe "localhost/file.html"?
When you say you moved it to an MVC project, what do you mean? Did you put it in a View, or did you just move the HTML and browse it?
All the questions above matter for pinning down where your problem is.
If you can browse the .html file in both situations, from the file system and from your running MVC project, then the problem is in the MVC project itself and it is a coding issue.
If you cannot browse the exact same .html both from the file system and from your running app, then the problem is not code but permissions: maybe CORS restrictions (by domain), maybe SSL (you said no), maybe you need a key (like a Google API key).
After doing a lot of searching I found the solution. Changing the security settings in IE, my ActiveX control now works on localhost:
Open IE >> Tools >> Internet Options >> Security >> Custom Level >> enable "Initialize and script ActiveX controls not marked as safe for scripting"
I would like to automate web tasks that would normally be done in a web browser, for example uploading a video to a certain video-sharing website. I want to make a pure command-line program that issues plain cURL commands, because those commands are readily accessible, for example via Firebug or the Chrome/Chromium dev console's Network pane. So I wouldn't like to use libcurl or similar libraries. I would prefer programming in Ruby.
The task is straightforward: I upload a video while watching the dev tools' Network pane, tracking the communication between the browser and the server. I copy POST and GET requests via the "Copy as cURL" menu and apply some modifications to the copied command, e.g. removing the header lines that send cookies and instead capturing cookies into a "cookie jar" text file (cURL's -c option), then sending the needed cookies later by passing that file back (the -b option). In the past I managed to write such Ruby scripts and they simply work: I can use those websites' services from the pure command line, so I can upload files from my VPS, which is very fast, unlike uploading from my home machine.
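To illustrate the cookie-jar round trip (a sketch in Node rather than Ruby, since the cURL flags are what matter; the URLs, form fields, and file names are placeholders):

const { execFileSync } = require("child_process");

// log in; -L follows redirects, -c writes the received cookies to the jar
execFileSync("curl", ["-s", "-L", "-c", "jar.txt",
    "--data", "user=me&pass=secret",      // placeholder form fields
    "https://example.com/login"]);        // placeholder URL

// later requests send the stored cookies back with -b
execFileSync("curl", ["-s", "-b", "jar.txt",
    "-F", "video=@movie.mp4",             // multipart upload, placeholder file
    "https://example.com/upload"]);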
Unfortunately the website I want to automate applies a lot of redirects even at the login stage (for example four consecutive redirects), which aren't tracked by the Chrome dev tools, so I can't see what really happens, when the needed cookies are stored, or which request is responsible for setting them. Sometimes the site uses tricky JavaScript calls to store a cookie that is needed for the video upload and even for exporting the video.
So my question is: besides the Chrome dev tools and Firebug, is there any automated, handy tool that can help with achieving such tasks?
Maybe BrowserAutomationStudio will help:
https://bablosoft.com/shop/BrowserAutomationStudio
This program can record your browser actions and replay them as a standalone bot.
I'm currently working on an application that uses the PhoneGap/Cordova framework to display an online and an offline version of a website. If you're not familiar with this framework, it offers a simple way of creating multi-platform applications by displaying local files in a full-screen webview.
When the application launches, the JavaScript included in the app's local files detects whether Internet access is available, and redirects the user either to another local web page containing a full-screen iframe of the live website, or to a reduced offline version of the website (contained in the local files of the app) if no Internet connection is detected.
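A minimal sketch of that check (the page names are placeholders; note that navigator.onLine can report false positives, so a real app might also ping a known URL):

document.addEventListener("deviceready", function () {
    // route to the online wrapper page or the reduced local version
    if (navigator.onLine) {
        window.location = "online.html";   // local page containing the full-screen iframe
    } else {
        window.location = "offline.html";  // reduced offline version shipped with the app
    }
}, false);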
I would like to detect when the user logs in using the various forms on the website (displayed inside the iframe), but I have no way of knowing which page the user is on, or of interacting with the website's content at all, because of the same-origin policy.
Would it be possible, though, to make the JavaScript from the local page (which contains the iframe) interact with the JavaScript from the remote page (displayed in the iframe)? That way I would be able to obtain the login information and save it for later use (obviously not without using a token system), and it would also help with another planned feature (triggering the guidance system).
Thank you.
Look into HTML5 cross-document messaging (postMessage); it's pretty simple and it sounds like it fits your needs:
http://stevehanov.ca/blog/index.php?id=109
https://developer.mozilla.org/en-US/docs/Web/API/Window/postMessage
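A minimal sketch, assuming you can add a small script to the remote site (postMessage only works if both pages cooperate; the origin check and field names below are placeholders):

// in the remote page (inside the iframe), after a successful login:
parent.postMessage({ type: "login", user: "..." }, "*");
// "*" because the parent is a local file with no real origin; restrict if you can

// in the local Cordova page that hosts the iframe:
window.addEventListener("message", function (event) {
    if (event.origin !== "https://www.example.com") return;  // placeholder origin check
    if (event.data && event.data.type === "login") {
        localStorage.setItem("user", event.data.user);  // stash for later use
    }
}, false);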
We have an existing app that uses JavaScript and embeds Flash/Flex. We need to make the whole thing work offline at short notice, and wonder what the fastest way to make Flex work offline is. The offline app we have in mind is a window that brings up a web browser with the JS files stored locally. The Flex app currently loads MP3 files and data files from the server.
Is there any way to make Flex load files from the file system automatically, without the user having a web server set up? I know it can be done with AIR, but we would prefer not to do that. Also, if we are forced to use AIR, is it possible to embed an AIR application in a web page the same way you embed a Flex app? The Flex app is embedded seamlessly, as shown here:
1. Go to www.eyespeakpro.com and click "free trial" after choosing your gender.
2. Click on the "Conversation in daily life" course, click on the first lesson, and click "go".
3. Ignore that lesson and click on the speech bubble 2nd from the left in the bottom-right corner. This brings up the Flex app, and if you watch the network traffic you can see the files being downloaded when you click the right arrow for the next sentence, etc.
Thanks.
You can run a SWF file offline. For security, though, a SWF compiled for network use is not allowed to load local files. I'm not sure whether you use ExternalInterface with JavaScript; I don't think a local SWF has access to local JavaScript.
Have a look at the compiler option 'use-network' (http://help.adobe.com/en_US/flex/using/WS2db454920e96a9e51e63e3d11c0bf69084-7a92.html). You can set this option to false so the SWF can load local files/MP3s.
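Concretely, that is just a compiler flag (the application file name is a placeholder):

mxmlc -use-network=false MyApp.mxml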
If you only want to show the application locally on known computers, you can have a look at the Settings Manager (Flash < 10.3), http://www.macromedia.com/support/documentation/nl/flashplayer/help/settings_manager04.html#117502, to trust a folder or SWF on your computer.
For Flash > 10.3 you should have a look at your control panel to trust a SWF (http://www.macromedia.com/support/documentation/nl/flashplayer/help/settings_manager.html).
I have a business web app that needs to pull in information from various other web sites. For most sites, the user just instructs the server to pull the data (either using .NET's HttpRequest, or Selenium).
But for some unfriendly, Javascript-heavy sites, our users have to visit the site manually, navigate to the right spot, and copy and paste into our application.
Other than bookmarklets, is there any way for our page to show an IFRAME with the source web site loaded, allow the user to navigate within the frame, and then capture the IFRAME's body?
Since the site in the IFRAME isn't in the same domain (not even close), I can't seem to work around the browser's cross-site scripting limitations. I've tried HTML5's "sandbox" feature, but it appears to only allow communication (via "allow-same-origin") the other way, from the IFRAME to the host site, which isn't useful to me. Also, it doesn't work if the site in question attempts to load its frames into the top context.
What I'm ideally looking for is a solution that would allow the browser to be configured to trust my web site implicitly (it's an intranet app) and allow it to access any frame's contents. That would at least get me in the ballpark. Bonus points if I can get the iframe to redefine the "top" context as its own frame, so the hosted site functions properly within the frame.
The best approach I've found across many screen-scraping projects (scraping JS-heavy pages) is to create a user script (Greasemonkey script), set up a few virtual machines in their own IP space (for protection), and feed them a list of sites to visit from a remote program:
Check the queue at a set interval
Request page with Greasemonkey, etc.
Capture contents and send to remote program for processing
You can't use an iframe method; you are going to bang your head against a wall trying to go that route. The method I've described has worked for numerous large-scale scraping projects.
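To give an idea, a user-script skeleton along those lines (a sketch; the @match pattern, collector URL, and delay are placeholders, and GM_xmlhttpRequest is used because it is exempt from the same-origin policy):

// ==UserScript==
// @name     scrape-and-report
// @match    https://www.example.com/*
// @grant    GM_xmlhttpRequest
// ==/UserScript==

// capture the rendered page once its scripts have settled, then ship it home
window.addEventListener("load", function () {
    setTimeout(function () {
        GM_xmlhttpRequest({
            method: "POST",
            url: "https://collector.internal/pages",   // placeholder endpoint
            data: document.documentElement.outerHTML,
            headers: { "Content-Type": "text/html" }
        });
    }, 5000);  // crude wait for JS-heavy pages to finish rendering
}, false);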