I have created a Cordova project via the create script and have written a plugin.
However, for some reason, the JS callbacks are never invoked when I send a PluginResult.
I have checked the callbackId; it is filled with a (seemingly) valid callback.
My plugin mapping is fine, the native code gets executed and there are no errors.
I have no clue why the callbacks aren't fired.
My JavaScript:
(function(cordova) {
function Plugin() {}
Plugin.prototype.FBAuthorize = function(config) {
cordova.exec(null, null, "Plugin", "FBAuthorize", []);
};
Plugin.prototype.FBGetLoginStatus = function(cb, fail) {
cordova.exec(function(){alert('test')}, function(){alert('test')}, "Plugin", "FBGetLoginStatus", []);
}
Plugin.prototype.FBPostToUserTimeline = function(textToShare, imageUrl, linkUrl) {
cordova.exec(null, null, "Plugin", "FBPostToUserTimeline", [textToShare, imageUrl, linkUrl]);
}
if(!window.plugins) window.plugins = {};
window.plugins.Plugin = new Plugin();
})(window.cordova || window.Cordova || window.PhoneGap);
My iOS code:
- (void) FBGetLoginStatus:(CDVInvokedUrlCommand*)command {
CDVPluginResult* result = nil;
NSString* javascript = nil;
if([[FBSession activeSession] isOpen]) {
result = [CDVPluginResult resultWithStatus:CDVCommandStatus_OK messageAsString:@"Logged in"];
javascript = [result toSuccessCallbackString:command.callbackId];
} else {
result = [CDVPluginResult resultWithStatus:CDVCommandStatus_ERROR messageAsString:@"Not logged in"];
javascript = [result toErrorCallbackString:command.callbackId];
}
[self writeJavascript:javascript];
}
I'm not an expert in PhoneGap, but this works for me; you can try (iOS code):
[self.commandDelegate sendPluginResult:result callbackId:command.callbackId];
instead of
[self writeJavascript:javascript];
I hope this helps.
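For context, the reason the delegate route works is that the JS bridge keeps a registry of pending callbacks keyed by callbackId, and sendPluginResult routes the native result back through that registry. Here is a minimal model of that cycle in plain JavaScript (fakeExec and fireResult are illustrative stand-ins, not real Cordova APIs):

```javascript
// Minimal model of the Cordova JS bridge's callback registry.
// fakeExec and fireResult are illustrative stand-ins, not real Cordova APIs.
var callbacks = {};
var nextId = 1;

function fakeExec(success, fail, service, action, args) {
  var callbackId = service + String(nextId++);
  callbacks[callbackId] = { success: success, fail: fail };
  // The native side must answer with this exact callbackId,
  // which is what sendPluginResult:callbackId: does for you.
  return callbackId;
}

// Called when the native layer reports a result for callbackId.
function fireResult(callbackId, ok, message) {
  var cb = callbacks[callbackId];
  if (!cb) return; // unknown id: the JS callback silently never fires
  (ok ? cb.success : cb.fail)(message);
  delete callbacks[callbackId];
}
```

If the native side bypasses this dispatch (or answers with a stale id), the registered success/error functions simply never run, which matches the symptom in the question.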
Related
I need to create a web browser using CefSharp.Wpf with the ability to feed fake data to sites, for example CPU cores, browser plugins, platform name etc.
There is a site that can retrieve all this info: https://www.deviceinfo.me/
My question is: how can I hide the GPU info from this site, using JavaScript or CefSharp functionality?
I have tried to redefine the WebGLRenderingContext.getParameter method, which gives info about the GPU renderer and vendor:
var canvas = document.createElement('canvas');
var gl;
try {
gl = canvas.getContext("webgl2") || canvas.getContext("webgl") || canvas.getContext("experimental-webgl2") || canvas.getContext("experimental-webgl");
} catch (e) {
}
var oldParam = WebGLRenderingContext.prototype.getParameter;
WebGLRenderingContext.prototype.getParameter = function(parameter){
console.log("we have guests");
if(parameter == debugInfo.UNMASKED_RENDERER_WEBGL){
return "GTX 1080";
}
if(parameter == gl.getExtension("WEBGL_debug_renderer_info").UNMASKED_RENDERER_WEBGL){
return "GTX 1080";
}
if(parameter == debugInfo.UNMASKED_RENDERER_WEBGL){
return "NVidia";
}
if(parameter == gl.VERSION){
return "GTX 1080";
}
return oldParam(parameter);
};
I expected to completely redefine this method and return some fake info, but when I called gl.getParameter(param) again, it still gave me the old GPU info.
If you still want Canvas2D and WebGL to work, then you can't hide, since sites can fingerprint by actually rendering.
You could disable them with
HTMLCanvasElement.prototype.getContext = function() {
return null;
};
Though the fact that they don't exist is also a data point.
Otherwise your wrapper appears to have some issues.
First, you really should set the function before creating the context.
Second, your last line should be
oldParam.call(this, parameter);
Also, you didn't show debugInfo, but you can use WebGLRenderingContext instead, or you can just hard-code the numbers.
As for http://www.deviceinfo.me, you need to make sure your patch runs in all iframes and workers before any other JavaScript.
WebGLRenderingContext.prototype.getParameter = function(origFn) {
const paramMap = {};
paramMap[0x9245] = "Foo"; // UNMASKED_VENDOR_WEBGL
paramMap[0x9246] = "Bar"; // UNMASKED_RENDERER_WEBGL
paramMap[0x1F00] = "Nobody"; // VENDOR
paramMap[0x1F01] = "Jim"; // RENDERER
paramMap[0x1F02] = "Version 1.0"; // VERSION
return function(parameter) {
return paramMap[parameter] || origFn.call(this, parameter);
};
}(WebGLRenderingContext.prototype.getParameter);
// --- test
const gl = document.createElement('canvas').getContext('webgl');
const ext = gl.getExtension('WEBGL_debug_renderer_info');
show(gl, gl, [
'VENDOR',
'RENDERER',
'VERSION',
]);
if (ext) {
show(gl, ext, [
'UNMASKED_VENDOR_WEBGL',
'UNMASKED_RENDERER_WEBGL',
]);
}
function show(gl, base, params) {
for (const param of params) {
console.log(param, ':', gl.getParameter(base[param]));
}
}
Note that there are both WebGLRenderingContext and WebGL2RenderingContext, so the wrapper may need to be installed on both.
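Since each context type has its own prototype, the same wrapper has to be installed on each. A small helper makes that easy. This is a sketch: patchGetParameter is a made-up name, and in a browser you would pass WebGLRenderingContext.prototype and WebGL2RenderingContext.prototype (the mock in the comments is only for illustration):

```javascript
// Install a spoofing wrapper on any object that has a getParameter method.
// patchGetParameter is an illustrative helper, not a browser API.
function patchGetParameter(proto, paramMap) {
  const origFn = proto.getParameter;
  proto.getParameter = function (parameter) {
    // Return the fake value for mapped parameters; fall through to the
    // real implementation otherwise, keeping `this` bound to the context
    // (the missing .call(this, ...) was the bug in the question's code).
    return paramMap[parameter] !== undefined
      ? paramMap[parameter]
      : origFn.call(this, parameter);
  };
}

// In a browser you would do something like:
// const map = { 0x9245: "Foo", 0x9246: "Bar" }; // UNMASKED_VENDOR/RENDERER_WEBGL
// patchGetParameter(WebGLRenderingContext.prototype, map);
// if (window.WebGL2RenderingContext) {
//   patchGetParameter(WebGL2RenderingContext.prototype, map);
// }
```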
I have a piece of Visual C++ code (FireBreath) that retrieves data from a Visual C# application. The communication works fine for negative cases (returns a value without out parameters), but in the positive case (returns a value with out parameters) the following error is shown:
Error: Error calling method on NPObject!
I guess the problem is in the out parameters of the Visual C# application. Can anyone help me with this? Please use the code below for reference.
ngContent.js (JS that calls the FireBreath function):
function GetDetails(param1, param2) {
try {
return document.getElementById("nGCall").ReturnDetails(param1, param2);
}
catch (e) {
alert("Exception Occured " + e.message);
}
};
nGAPI.cpp (FireBreath function that calls the C# application):
FB::VariantList nGAPI::ReturnDetails(std::wstring& param1, std::wstring& param2)
{
try {
InetGuardNPAPI *CSharpInterface = NULL;
//Open interface to C#
CoInitialize(NULL);
HRESULT hr = CoCreateInstance(CLSID_netGuardIEBHO, NULL, CLSCTX_INPROC_SERVER, IID_InetGuardNPAPI, reinterpret_cast<void**>(&CSharpInterface));
BSTR out1= NULL;
BSTR out2= NULL;
BSTR out3= NULL;
BSTR out4= NULL;
int returns = CSharpInterface->NPGetDetails(param1.c_str(), param2.c_str(), &out1, &out2, &out3, &out4);
if (out1 != NULL && out2 != NULL) {
return FB::variant_list_of(out1)(out2)(out3)(out4);
} else {
return FB::variant_list_of();
}
} catch (...) {
MessageBoxW(NULL, L"Exception occured.", L"NG", NULL);
return FB::variant_list_of();
}
}
nGHost.cs: (Visual C# application)
public int NPGetDetails(string param1, string param2, out string out1, out string out2, out string out3, out string out4)
{
int retValue = 0;
out1 = null;
out2 = null;
out3 = null;
out4 = null;
bool userIdentified = IdentifyUser(ngDB, out ngdata);
if (!userIdentified)
return retValue;
try {
out1 = ngdata.abc;
out2 = ngdata.def;
out3 = ngdata.ghi;
out4 = ngdata.jkl;
retValue = 1;
} catch (Exception ex) {
MessageBox.Show(ex.Message);
}
return retValue;
}
Thanks in advance.
The error you're getting indicates that an exception was thrown. Unfortunately, browsers stopped exposing exception details when they switched to out-of-process plugins. I'd recommend attaching a debugger and stepping through to see what is actually happening; alternatively, add some logging.
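On the JavaScript side, one cheap way to add that logging is a generic wrapper that records the arguments and any thrown message before the browser swallows them. This is a sketch; withLogging and its logger parameter are illustrative helpers, not FireBreath APIs:

```javascript
// Wrap any function so its arguments and failures are reported to `logger`.
// Purely illustrative plumbing for narrowing down NPObject call errors.
function withLogging(fn, name, logger) {
  return function () {
    var args = Array.prototype.slice.call(arguments);
    logger(name + ' called with ' + JSON.stringify(args));
    try {
      return fn.apply(this, args);
    } catch (e) {
      logger(name + ' threw: ' + e.message);
      throw e; // rethrow so existing error handling still runs
    }
  };
}
```

Usage might look like wrapping the plugin call in GetDetails, e.g. `withLogging(function (a, b) { return plugin.ReturnDetails(a, b); }, 'ReturnDetails', console.log)`, so you can see exactly which arguments reach the plugin before the generic NPObject error appears.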
I am recording a sound file (wav format) in Objective-C. I want to pass it back to JavaScript using Objective-C's stringByEvaluatingJavaScriptFromString. I am thinking that I will have to convert the wav file to a base64 string to pass it to this function, then convert the base64 string back to wav/blob format in JavaScript so I can feed it to an audio tag to play it. I don't know how to do that. Also, I am not sure that this is the best way to pass a wav file back to JavaScript. Any ideas will be appreciated.
Well, this was not as straightforward as I expected, so here is how I was able to achieve it.
Step 1: I recorded the audio in caf format using AVAudioRecorder.
NSArray *dirPaths;
NSString *docsDir;
dirPaths = NSSearchPathForDirectoriesInDomains(NSDocumentDirectory, NSUserDomainMask, YES);
docsDir = [dirPaths objectAtIndex:0];
soundFilePath = [docsDir stringByAppendingPathComponent:@"sound.caf"];
NSURL *soundFileURL = [NSURL fileURLWithPath:soundFilePath];
NSDictionary *recordSettings = [NSDictionary dictionaryWithObjectsAndKeys:
[NSNumber numberWithInt:AVAudioQualityMin],
AVEncoderAudioQualityKey,
[NSNumber numberWithInt:16],
AVEncoderBitRateKey,
[NSNumber numberWithInt:2],
AVNumberOfChannelsKey,
[NSNumber numberWithFloat:44100],
AVSampleRateKey,
nil];
NSError *error = nil;
audioRecorder = [[AVAudioRecorder alloc]
initWithURL:soundFileURL
settings:recordSettings error:&error];
if(error)
{
NSLog(@"error: %@", [error localizedDescription]);
} else {
[audioRecorder prepareToRecord];
}
After this, you just need to call audioRecorder.record to record the audio; it will be recorded in caf format. If you want to see my recordAudio function, here it is:
- (void) recordAudio
{
if(!audioRecorder.recording)
{
_playButton.enabled = NO;
_recordButton.title = @"Stop";
[audioRecorder record];
[self animate1:nil finished:nil context:nil];
}
else
{
[_recordingImage stopAnimating];
[audioRecorder stop];
_playButton.enabled = YES;
_recordButton.title = @"Record";
}
}
Step 2: Convert the caf format to wav format. I was able to do this using the following function.
-(BOOL)exportAssetAsWaveFormat:(NSString*)filePath
{
NSError *error = nil ;
NSDictionary *audioSetting = [NSDictionary dictionaryWithObjectsAndKeys:
[ NSNumber numberWithFloat:44100.0], AVSampleRateKey,
[ NSNumber numberWithInt:2], AVNumberOfChannelsKey,
[ NSNumber numberWithInt:16], AVLinearPCMBitDepthKey,
[ NSNumber numberWithInt:kAudioFormatLinearPCM], AVFormatIDKey,
[ NSNumber numberWithBool:NO], AVLinearPCMIsFloatKey,
[ NSNumber numberWithBool:0], AVLinearPCMIsBigEndianKey,
[ NSNumber numberWithBool:NO], AVLinearPCMIsNonInterleaved,
[ NSData data], AVChannelLayoutKey, nil ];
NSString *audioFilePath = filePath;
AVURLAsset * URLAsset = [[AVURLAsset alloc] initWithURL:[NSURL fileURLWithPath:audioFilePath] options:nil];
if (!URLAsset) return NO ;
AVAssetReader *assetReader = [AVAssetReader assetReaderWithAsset:URLAsset error:&error];
if (error) return NO;
NSArray *tracks = [URLAsset tracksWithMediaType:AVMediaTypeAudio];
if (![tracks count]) return NO;
AVAssetReaderAudioMixOutput *audioMixOutput = [AVAssetReaderAudioMixOutput
assetReaderAudioMixOutputWithAudioTracks:tracks
audioSettings :audioSetting];
if (![assetReader canAddOutput:audioMixOutput]) return NO ;
[assetReader addOutput :audioMixOutput];
if (![assetReader startReading]) return NO;
NSString *title = @"WavConverted";
NSArray *docDirs = NSSearchPathForDirectoriesInDomains (NSDocumentDirectory, NSUserDomainMask, YES);
NSString *docDir = [docDirs objectAtIndex: 0];
NSString *outPath = [[docDir stringByAppendingPathComponent :title]
stringByAppendingPathExtension:@"wav"];
if(![[NSFileManager defaultManager] removeItemAtPath:outPath error:NULL])
{
return NO;
}
soundFilePath = outPath;
NSURL *outURL = [NSURL fileURLWithPath:outPath];
AVAssetWriter *assetWriter = [AVAssetWriter assetWriterWithURL:outURL
fileType:AVFileTypeWAVE
error:&error];
if (error) return NO;
AVAssetWriterInput *assetWriterInput = [ AVAssetWriterInput assetWriterInputWithMediaType :AVMediaTypeAudio
outputSettings:audioSetting];
assetWriterInput. expectsMediaDataInRealTime = NO;
if (![assetWriter canAddInput:assetWriterInput]) return NO ;
[assetWriter addInput :assetWriterInput];
if (![assetWriter startWriting]) return NO;
//[assetReader retain];
//[assetWriter retain];
[assetWriter startSessionAtSourceTime:kCMTimeZero ];
dispatch_queue_t queue = dispatch_queue_create( "assetWriterQueue", NULL );
[assetWriterInput requestMediaDataWhenReadyOnQueue:queue usingBlock:^{
NSLog(@"start");
while (1)
{
if ([assetWriterInput isReadyForMoreMediaData] && (assetReader.status == AVAssetReaderStatusReading)) {
CMSampleBufferRef sampleBuffer = [audioMixOutput copyNextSampleBuffer];
if (sampleBuffer) {
[assetWriterInput appendSampleBuffer :sampleBuffer];
CFRelease(sampleBuffer);
} else {
[assetWriterInput markAsFinished];
break;
}
}
}
[assetWriter finishWriting];
//[self playWavFile];
NSError *err;
NSData *audioData = [NSData dataWithContentsOfFile:soundFilePath options: 0 error:&err];
[self.audioDelegate doneRecording:audioData];
//[assetReader release ];
//[assetWriter release ];
NSLog(@"soundFilePath=%@",soundFilePath);
NSDictionary *dict = [[NSFileManager defaultManager] attributesOfItemAtPath:soundFilePath error:&err];
NSLog(@"size of wav file = %@",[dict objectForKey:NSFileSize]);
//NSLog(@"finish");
}];
return YES;
}
In this function, I am calling the audioDelegate function doneRecording with audioData, which is in wav format. Here is the code for doneRecording:
-(void) doneRecording:(NSData *)contents
{
myContents = [[NSData dataWithData:contents] retain];
[self returnResult:alertCallbackId args:@"Recording Done.",nil];
}
// Call this function when you have results to send back to javascript callbacks
// callbackId : int comes from handleCall function
// args: list of objects to send to the javascript callback
- (void)returnResult:(int)callbackId args:(id)arg, ...
{
if (callbackId==0) return;
va_list argsList;
NSMutableArray *resultArray = [[NSMutableArray alloc] init];
if(arg != nil){
[resultArray addObject:arg];
va_start(argsList, arg);
while((arg = va_arg(argsList, id)) != nil)
[resultArray addObject:arg];
va_end(argsList);
}
NSString *resultArrayString = [json stringWithObject:resultArray allowScalar:YES error:nil];
[self performSelectorOnMainThread:@selector(stringByEvaluatingJavaScriptFromString:) withObject:[NSString stringWithFormat:@"NativeBridge.resultForCallback(%d,%@);",callbackId,resultArrayString] waitUntilDone:NO];
[resultArray release];
}
Step 3: Now it is time to tell the JavaScript inside the UIWebView that we are done recording the audio, so it can start accepting data in blocks from us. I am using WebSockets to transfer the data back to JavaScript. The data is transferred in blocks because the server (https://github.com/benlodotcom/BLWebSocketsServer) that I was using was built on libwebsockets (http://git.warmcat.com/cgi-bin/cgit/libwebsockets/).
This is how you start the server in the delegate class:
- (id)initWithFrame:(CGRect)frame
{
if (self = [super initWithFrame:frame]) {
[self _createServer];
[self.server start];
myContents = [NSData data];
// Set delegate in order to "shouldStartLoadWithRequest" to be called
self.delegate = self;
// Set non-opaque in order to make "body{background-color:transparent}" working!
self.opaque = NO;
// Instanciate JSON parser library
json = [ SBJSON new ];
// load our html file
NSString *path = [[NSBundle mainBundle] pathForResource:@"webview-document" ofType:@"html"];
[self loadRequest:[NSURLRequest requestWithURL:[NSURL fileURLWithPath:path]]];
}
return self;
}
-(void) _createServer
{
/*Create a simple echo server*/
self.server = [[BLWebSocketsServer alloc] initWithPort:9000 andProtocolName:echoProtocol];
[self.server setHandleRequestBlock:^NSData *(NSData *data) {
NSString *convertedString = [[NSString alloc] initWithData:data encoding:NSUTF8StringEncoding];
NSLog(@"Received Request...%@",convertedString);
if([convertedString isEqualToString:@"start"])
{
NSLog(@"myContents size: %d",[myContents length]);
int contentSize = [myContents length];
int chunkSize = 64*1023;
chunksCount = ([myContents length]/(64*1023))+1;
NSLog(@"ChunkSize=%d",chunkSize);
NSLog(@"chunksCount=%d",chunksCount);
chunksArray = [[NSMutableArray array] retain];
int index = 0;
//NSRange chunkRange;
for(int i=1;i<=chunksCount;i++)
{
if(i==chunksCount)
{
NSRange chunkRange = {index,contentSize-index};
NSLog(@"chunk# = %d, chunkRange=(%d,%d)",i,index,contentSize-index);
NSData *dataChunk = [myContents subdataWithRange:chunkRange];
[chunksArray addObject:dataChunk];
break;
}
else
{
NSRange chunkRange = {index, chunkSize};
NSLog(@"chunk# = %d, chunkRange=(%d,%d)",i,index,chunkSize);
NSData *dataChunk = [myContents subdataWithRange:chunkRange];
index += chunkSize;
[chunksArray addObject:dataChunk];
}
}
return [chunksArray objectAtIndex:0];
}
else
{
int chunkNumber = [convertedString intValue];
if(chunkNumber>0 && (chunkNumber+1)<=chunksCount)
{
return [chunksArray objectAtIndex:(chunkNumber)];
}
}
NSLog(@"Releasing Array");
[chunksArray release];
chunksCount = 0;
return [NSData dataWithBase64EncodedString:@"Stop"];
}];
}
The code on the JavaScript side is:
var socket;
var chunkCount = 0;
var soundBlob, soundUrl;
var smallBlobs = new Array();
function captureMovieCallback(response)
{
if(socket)
{
try{
socket.send('start');
}
catch(e)
{
log('Socket is not valid object');
}
}
else
{
log('socket is null');
}
}
function closeSocket(response)
{
socket.close();
}
function connect(){
try{
window.WebSocket = window.WebSocket || window.MozWebSocket;
socket = new WebSocket('ws://127.0.0.1:9000',
'echo-protocol');
socket.onopen = function(){
}
socket.onmessage = function(e){
var data = e.data;
if(e.data instanceof ArrayBuffer)
{
log('its arrayBuffer');
}
else if(e.data instanceof Blob)
{
if(soundBlob)
log('its Blob of size = '+ e.data.size + ' final blob size:'+ soundBlob.size);
if(e.data.size != 3)
{
//log('its Blob of size = '+ e.data.size);
smallBlobs[chunkCount]= e.data;
chunkCount = chunkCount +1;
socket.send(''+chunkCount);
}
else
{
//alert('End Received');
try{
soundBlob = new Blob(smallBlobs,{ "type" : "audio/wav" });
var myURL = window.URL || window.webkitURL;
soundUrl = myURL.createObjectURL(soundBlob);
log('soundURL='+soundUrl);
}
catch(e)
{
log('Problem creating blob and url.');
}
try{
var serverUrl = 'http://10.44.45.74:8080/MyTestProject/WebRecording?record';
var xhr = new XMLHttpRequest();
xhr.open('POST',serverUrl,true);
xhr.setRequestHeader("content-type","multipart/form-data");
xhr.send(soundBlob);
}
catch(e)
{
log('error uploading blob file');
}
socket.close();
}
//alert(JSON.stringify(msg, null, 4));
}
else
{
log('dont know');
}
}
socket.onclose = function(){
//message('<p class="event">Socket Status: '+socket.readyState+' (Closed)');
log('final blob size:'+soundBlob.size);
}
} catch(exception){
log('<p>Error: '+exception);
}
}
function log(msg) {
NativeBridge.log(msg);
}
function stopCapture() {
NativeBridge.call("stopMovie", null,null);
}
function startCapture() {
NativeBridge.call("captureMovie",null,captureMovieCallback);
}
NativeBridge.js
var NativeBridge = {
callbacksCount : 1,
callbacks : {},
// Automatically called by native layer when a result is available
resultForCallback : function resultForCallback(callbackId, resultArray) {
try {
var callback = NativeBridge.callbacks[callbackId];
if (!callback) return;
console.log("calling callback for "+callbackId);
callback.apply(null,resultArray);
} catch(e) {alert(e)}
},
// Use this in javascript to request native objective-c code
// functionName : string (I think the name is explicit :p)
// args : array of arguments
// callback : function with n-arguments that is going to be called when the native code returned
call : function call(functionName, args, callback) {
//alert("call");
//alert('callback='+callback);
var hasCallback = callback && typeof callback == "function";
var callbackId = hasCallback ? NativeBridge.callbacksCount++ : 0;
if (hasCallback)
NativeBridge.callbacks[callbackId] = callback;
var iframe = document.createElement("IFRAME");
iframe.setAttribute("src", "js-frame:" + functionName + ":" + callbackId+ ":" + encodeURIComponent(JSON.stringify(args)));
document.documentElement.appendChild(iframe);
iframe.parentNode.removeChild(iframe);
iframe = null;
},
log : function log(message) {
var iframe = document.createElement("IFRAME");
iframe.setAttribute("src", "ios-log:"+encodeURIComponent(JSON.stringify("#iOS#" + message)));
document.documentElement.appendChild(iframe);
iframe.parentNode.removeChild(iframe);
iframe = null;
}
};
We call connect() on the JavaScript side on body load in the HTML.
Once we receive the callback (captureMovieCallback) from the startCapture function, we send a start message indicating that we are ready to accept the data.
The server on the Objective-C side splits the wav audio data into small chunks of chunkSize = 64*1023 bytes and stores them in an array.
It sends the first block back to the JavaScript side.
JavaScript accepts this block and sends back the number of the next block that it needs from the server.
The server sends the block indicated by this number. This process is repeated until we send the last block to JavaScript.
At the end we send a stop message back to the JavaScript side indicating that we are done. It is apparently 3 bytes in size (which is used as the criterion to break this loop).
Every block is stored as a small blob in an array. We then create a bigger blob from these small blobs using the following line:
soundBlob = new Blob(smallBlobs,{ "type" : "audio/wav" });
This blob is uploaded to the server, which writes it out as a wav file. We can then pass the URL of this wav file as the src of an audio tag to replay it on the JavaScript side. We close the WebSocket connection after sending the blob to the server.
Hope this is clear enough to understand.
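To make the chunking concrete, the split/reassemble round trip can be sketched in a few lines of JavaScript (Node's Buffer stands in here for NSData on one side and the Blob array on the other; the 64*1023 chunk size is the one used above, while the function names are made up):

```javascript
// Split a byte buffer into fixed-size chunks, as the Objective-C server does,
// then reassemble them, as the browser does with its array of small blobs.
var CHUNK_SIZE = 64 * 1023;

function splitIntoChunks(buffer, chunkSize) {
  var chunks = [];
  for (var index = 0; index < buffer.length; index += chunkSize) {
    // The final chunk simply carries whatever bytes remain.
    chunks.push(buffer.slice(index, Math.min(index + chunkSize, buffer.length)));
  }
  return chunks;
}

function reassemble(chunks) {
  // Browser side would instead do: new Blob(chunks, { type: 'audio/wav' })
  return Buffer.concat(chunks);
}
```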
If all you want to do is play the sound, then you'd be much better off using one of the native audio playback systems in iOS rather than the HTML audio tag.
I have a similar question here, but I thought I'd ask it a different way to cast a wider net. I haven't come across a workable solution yet (that I know of).
I'd like for Xcode to issue a JavaScript command and get a return value back from an executeSql callback.
From the research I've been reading, I can't issue a synchronous executeSql command. The closest I came was trying to spin-lock until I got the callback, but that hasn't worked yet either. Maybe my spinning isn't giving the callback a chance to come back (see code below).
Q: How can jQuery have an async=false argument when it comes to Ajax? Is there something different about XHR than there is about the executeSql command?
Here is my proof-of-concept so far (please don't laugh):
// First define any dom elements that are referenced more than once.
var dom = {};
dom.TestID = $('#TestID'); // <input id="TestID">
dom.msg = $('#msg'); // <div id="msg"></div>
window.dbo = openDatabase('POC','1.0','Proof-Of-Concept', 1024*1024); // 1MB
!function($, window, undefined) {
var Variables = {}; // Variables that are to be passed from one function to another.
Variables.Ready = new $.Deferred();
Variables.DropTableDeferred = new $.Deferred();
Variables.CreateTableDeferred = new $.Deferred();
window.dbo.transaction(function(myTrans) {
myTrans.executeSql(
'drop table Test;',
[],
Variables.DropTableDeferred.resolve()
// ,WebSqlError
);
});
$.when(Variables.DropTableDeferred).done(function() {
window.dbo.transaction(function(myTrans) {
myTrans.executeSql(
'CREATE TABLE IF NOT EXISTS Test'
+ '(TestID Integer NOT NULL PRIMARY KEY'
+ ',TestSort Int'
+ ');',
[],
Variables.CreateTableDeferred.resolve(),
WebSqlError
);
});
});
$.when(Variables.CreateTableDeferred).done(function() {
for (var i=0;i < 10;i++) {
myFunction(i);
};
Variables.Ready.resolve();
function myFunction(i) {
window.dbo.transaction(function(myTrans) {
myTrans.executeSql(
'INSERT INTO Test(TestID,TestSort) VALUES(?,?)',
[
i
,i+100000
]
,function() {}
,WebSqlError
)
});
};
});
$.when(Variables.Ready).done(function() {
$('#Save').removeAttr('disabled');
});
}(jQuery, window);
!function($, window, undefined) {
var Variables = {};
$(document).on('click','#Save',function() {
var local = {};
local.result = barcode.Scan(dom.TestID.val());
console.log(local.result);
});
var mySuccess = function(transaction, argument) {
var local = {};
for (local.i=0; local.i < argument.rows.length; local.i++) {
local.qry = argument.rows.item(local.i);
Variables.result = local.qry.TestSort;
}
Variables.Return = true;
};
var myError = function(transaction, argument) {
dom.msg.text(argument.message);
Variables.result = '';
Variables.Return = true;
}
var barcode = {};
barcode.Scan = function(argument) {
var local = {};
Variables.result = '';
Variables.Return = false;
window.dbo.transaction(function(myTrans) {
myTrans.executeSql(
'SELECT * FROM Test WHERE TestID=?'
,[argument]
,mySuccess
,myError
)
});
for (local.I = 0;local.I < 3; local.I++) { // Try a bunch of times.
if (Variables.Return) break; // Gets set in mySuccess and myError
SpinLock(250);
}
return Variables.result;
}
var SpinLock = function(milliseconds) {
var local = {};
local.StartTime = Date.now();
do {
} while (Date.now() < local.StartTime + milliseconds);
}
function WebSqlError(tx,result) {
if (dom.msg.text()) {
dom.msg.append('<br>');
}
dom.msg.append(result.message);
}
}(jQuery, window);
Is there something different about XHR than there is about the executeSql command?
Kind of.
How can jQuery have an async=false argument when it comes to Ajax?
Ajax, or rather XMLHttpRequest, isn't strictly limited to being asynchronous -- though, as the original acronym suggested, it is preferred.
jQuery.ajax()'s async option is tied to the boolean async argument of xhr.open():
void open(
DOMString method,
DOMString url,
optional boolean async, // <---
optional DOMString user,
optional DOMString password
);
The Web SQL Database spec does also define a Synchronous database API. However, it's only available to implementations of the WorkerUtils interface, defined primarily for Web Workers:
window.dbo = openDatabaseSync('POC','1.0','Proof-Of-Concept', 1024*1024);
var results;
window.dbo.transaction(function (trans) {
results = trans.executeSql('...');
});
If the environment running the script hasn't implemented this interface, then you're stuck with the asynchronous API and returning the result will not be feasible. You can't force blocking/waiting of asynchronous tasks for the reason you suspected:
Maybe my spinning isn't giving the callback chance to come back (See code below).
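The spin lock can't work because page JavaScript runs on a single thread: while the loop spins, the event loop never gets a chance to deliver the success callback. The workable inversion is to hand back a Promise instead of blocking. Below is a sketch; fakeExecuteSql is a stand-in for a real db.transaction/executeSql call, and all the names are illustrative:

```javascript
// Wrap a callback-style async API in a Promise instead of spin-locking.
function promisify(callbackStyleFn) {
  return function (arg) {
    return new Promise(function (resolve, reject) {
      callbackStyleFn(arg, resolve, reject);
    });
  };
}

// Stand-in for an async database call: answers on the next tick,
// just as executeSql answers after the transaction completes.
function fakeExecuteSql(testId, onSuccess, onError) {
  setTimeout(function () {
    if (testId >= 0) onSuccess({ TestSort: testId + 100000 });
    else onError(new Error('not found'));
  }, 0);
}

var scan = promisify(fakeExecuteSql);
// scan(3).then(function (row) { /* use row.TestSort here */ });
```

The caller then does its work inside .then() rather than trying to return the value synchronously from barcode.Scan.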
As I have found out, there is no way to change the device brightness using PhoneGap, so I have decided to create this plugin myself. I am new to PhoneGap and I do not know whether it is hard or not.
I have read some examples on how to create plugins for PhoneGap, but there are some things I don't understand.
I have this code for changing the screen brightness, and I want to create a method for PhoneGap that calls it:
private void setBrightness(int brightness) {
WindowManager.LayoutParams layoutParams = getWindow().getAttributes();
layoutParams.screenBrightness = brightness / 100.0f;
getWindow().setAttributes(layoutParams);
}
Is it possible?
Thanks
Yeah, it's pretty easy to do if you follow the plugin development guide. For what you want to do it would be like this:
cordova.define("cordova/plugin/brightness",
function(require, exports, module) {
var exec = require("cordova/exec");
var Brightness = function () {};
var BrightnessError = function(code, message) {
this.code = code || null;
this.message = message || '';
};
Brightness.CALL_FAILED = 0;
Brightness.prototype.set = function(level,success,fail) {
exec(success,fail,"Brightness", "set",[level]);
};
var brightness = new Brightness();
module.exports = brightness;
});
Then you'll need to write some Java code to change the brightness. You'll need to create a new class that extends the Plugin class and write an execute method like this:
public PluginResult execute(String action, JSONArray args, String callbackId) {
PluginResult.Status status = PluginResult.Status.OK;
String result = "";
try {
if (action.equals("set")) {
int brightness = args.getInt(0);
WindowManager.LayoutParams layoutParams = this.cordova.getActivity().getWindow().getAttributes();
layoutParams.screenBrightness = brightness / 100.0f;
this.cordova.getActivity().getWindow().setAttributes(layoutParams);
}
else {
status = PluginResult.Status.INVALID_ACTION;
}
return new PluginResult(status, result);
} catch (JSONException e) {
return new PluginResult(PluginResult.Status.JSON_EXCEPTION);
}
}
Whatever you call this class you'll need to add a line in the res/xml/config.xml file so the PluginManager can create it.
<plugin name="Brightness" value="org.apache.cordova.plugins.Brightness"/>
And finally, in your JavaScript code, you'll need to create the plugin and call it like this:
function panicButton() {
var brightness = cordova.require("cordova/plugin/brightness");
brightness.set(50);
}
That should about do it.