Passing a sound (WAV) file to JavaScript from Objective-C

I am recording a sound file (WAV format) in Objective-C. I want to pass it back to JavaScript using Objective-C's stringByEvaluatingJavaScriptFromString. I'm thinking that I will have to convert the WAV file to a base64 string to pass it to this function, and then convert the base64 string back to a WAV/Blob in JavaScript so I can feed it to an audio tag to play it. I don't know how to do that. Also, I'm not sure that this is the best way to pass a WAV file back to JavaScript. Any ideas will be appreciated.
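For reference, the base64 route described above would look roughly like this on the JavaScript side. This is only a sketch; the playWavFromBase64 function name and the audio element id are assumptions rather than part of any existing bridge.

// Assumed: the native side calls this function via stringByEvaluatingJavaScriptFromString,
// passing the WAV file contents as a base64 string.
function playWavFromBase64(base64) {
    // Decode the base64 string into raw bytes.
    var binary = atob(base64);
    var bytes = new Uint8Array(binary.length);
    for (var i = 0; i < binary.length; i++) {
        bytes[i] = binary.charCodeAt(i);
    }
    // Wrap the bytes in a Blob and hand an object URL to an <audio> element.
    var blob = new Blob([bytes], { type: 'audio/wav' });
    var url = (window.URL || window.webkitURL).createObjectURL(blob);
    var audio = document.getElementById('player'); // assumed <audio id="player"> in the page
    audio.src = url;
    audio.play();
}

The native side would then invoke it with something like stringByEvaluatingJavaScriptFromString:@"playWavFromBase64('<base64 data>')".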

Well, this was not as straightforward as I expected, so here is how I was able to achieve it.
Step 1: Record the audio in CAF format using AVAudioRecorder.
NSArray *dirPaths;
NSString *docsDir;
dirPaths = NSSearchPathForDirectoriesInDomains(NSDocumentDirectory, NSUserDomainMask, YES);
docsDir = [dirPaths objectAtIndex:0];
soundFilePath = [docsDir stringByAppendingPathComponent:@"sound.caf"];
NSURL *soundFileURL = [NSURL fileURLWithPath:soundFilePath];
NSDictionary *recordSettings = [NSDictionary dictionaryWithObjectsAndKeys:
[NSNumber numberWithInt:AVAudioQualityMin],
AVEncoderAudioQualityKey,
[NSNumber numberWithInt:16],
AVEncoderBitRateKey,
[NSNumber numberWithInt:2],
AVNumberOfChannelsKey,
[NSNumber numberWithFloat:44100],
AVSampleRateKey,
nil];
NSError *error = nil;
audioRecorder = [[AVAudioRecorder alloc]
initWithURL:soundFileURL
settings:recordSettings error:&error];
if(error)
{
NSLog(#"error: %#", [error localizedDescription]);
} else {
[audioRecorder prepareToRecord];
}
After this, you just need to call audioRecorder.record to record the audio; it will be recorded in CAF format. If you want to see my recordAudio function, here it is:
- (void) recordAudio
{
if(!audioRecorder.recording)
{
_playButton.enabled = NO;
_recordButton.title = #"Stop";
[audioRecorder record];
[self animate1:nil finished:nil context:nil];
}
else
{
[_recordingImage stopAnimating];
[audioRecorder stop];
_playButton.enabled = YES;
_recordButton.title = #"Record";
}
}
Step 2: Convert the CAF file to WAV format. I was able to do this with the following function.
-(BOOL)exportAssetAsWaveFormat:(NSString*)filePath
{
NSError *error = nil ;
NSDictionary *audioSetting = [NSDictionary dictionaryWithObjectsAndKeys:
[ NSNumber numberWithFloat:44100.0], AVSampleRateKey,
[ NSNumber numberWithInt:2], AVNumberOfChannelsKey,
[ NSNumber numberWithInt:16], AVLinearPCMBitDepthKey,
[ NSNumber numberWithInt:kAudioFormatLinearPCM], AVFormatIDKey,
[ NSNumber numberWithBool:NO], AVLinearPCMIsFloatKey,
[ NSNumber numberWithBool:0], AVLinearPCMIsBigEndianKey,
[ NSNumber numberWithBool:NO], AVLinearPCMIsNonInterleaved,
[ NSData data], AVChannelLayoutKey, nil ];
NSString *audioFilePath = filePath;
AVURLAsset * URLAsset = [[AVURLAsset alloc] initWithURL:[NSURL fileURLWithPath:audioFilePath] options:nil];
if (!URLAsset) return NO ;
AVAssetReader *assetReader = [AVAssetReader assetReaderWithAsset:URLAsset error:&error];
if (error) return NO;
NSArray *tracks = [URLAsset tracksWithMediaType:AVMediaTypeAudio];
if (![tracks count]) return NO;
AVAssetReaderAudioMixOutput *audioMixOutput = [AVAssetReaderAudioMixOutput
assetReaderAudioMixOutputWithAudioTracks:tracks
audioSettings :audioSetting];
if (![assetReader canAddOutput:audioMixOutput]) return NO ;
[assetReader addOutput :audioMixOutput];
if (![assetReader startReading]) return NO;
NSString *title = @"WavConverted";
NSArray *docDirs = NSSearchPathForDirectoriesInDomains (NSDocumentDirectory, NSUserDomainMask, YES);
NSString *docDir = [docDirs objectAtIndex: 0];
NSString *outPath = [[docDir stringByAppendingPathComponent:title]
stringByAppendingPathExtension:@"wav"];
// Remove any previous output file; ignore the result so a missing file does not abort the export.
[[NSFileManager defaultManager] removeItemAtPath:outPath error:NULL];
soundFilePath = outPath;
NSURL *outURL = [NSURL fileURLWithPath:outPath];
AVAssetWriter *assetWriter = [AVAssetWriter assetWriterWithURL:outURL
fileType:AVFileTypeWAVE
error:&error];
if (error) return NO;
AVAssetWriterInput *assetWriterInput = [ AVAssetWriterInput assetWriterInputWithMediaType :AVMediaTypeAudio
outputSettings:audioSetting];
assetWriterInput. expectsMediaDataInRealTime = NO;
if (![assetWriter canAddInput:assetWriterInput]) return NO ;
[assetWriter addInput :assetWriterInput];
if (![assetWriter startWriting]) return NO;
//[assetReader retain];
//[assetWriter retain];
[assetWriter startSessionAtSourceTime:kCMTimeZero ];
dispatch_queue_t queue = dispatch_queue_create( "assetWriterQueue", NULL );
[assetWriterInput requestMediaDataWhenReadyOnQueue:queue usingBlock:^{
NSLog(#"start");
while (1)
{
if ([assetWriterInput isReadyForMoreMediaData] && (assetReader.status == AVAssetReaderStatusReading)) {
CMSampleBufferRef sampleBuffer = [audioMixOutput copyNextSampleBuffer];
if (sampleBuffer) {
[assetWriterInput appendSampleBuffer :sampleBuffer];
CFRelease(sampleBuffer);
} else {
[assetWriterInput markAsFinished];
break;
}
}
}
[assetWriter finishWriting];
//[self playWavFile];
NSError *err;
NSData *audioData = [NSData dataWithContentsOfFile:soundFilePath options: 0 error:&err];
[self.audioDelegate doneRecording:audioData];
//[assetReader release ];
//[assetWriter release ];
NSLog(#"soundFilePath=%#",soundFilePath);
NSDictionary *dict = [[NSFileManager defaultManager] attributesOfItemAtPath:soundFilePath error:&err];
NSLog(#"size of wav file = %#",[dict objectForKey:NSFileSize]);
//NSLog(#"finish");
}];
return YES;
}
In this function I am calling the audioDelegate function doneRecording with audioData, which is in WAV format. Here is the code for doneRecording.
-(void) doneRecording:(NSData *)contents
{
myContents = [[NSData dataWithData:contents] retain];
[self returnResult:alertCallbackId args:@"Recording Done.",nil];
}
// Call this function when you have results to send back to javascript callbacks
// callbackId : int comes from handleCall function
// args: list of objects to send to the javascript callback
- (void)returnResult:(int)callbackId args:(id)arg, ...;
{
if (callbackId==0) return;
va_list argsList;
NSMutableArray *resultArray = [[NSMutableArray alloc] init];
if(arg != nil){
[resultArray addObject:arg];
va_start(argsList, arg);
while((arg = va_arg(argsList, id)) != nil)
[resultArray addObject:arg];
va_end(argsList);
}
NSString *resultArrayString = [json stringWithObject:resultArray allowScalar:YES error:nil];
[self performSelectorOnMainThread:@selector(stringByEvaluatingJavaScriptFromString:) withObject:[NSString stringWithFormat:@"NativeBridge.resultForCallback(%d,%@);",callbackId,resultArrayString] waitUntilDone:NO];
[resultArray release];
}
Step 3: Now it is time to communicate back to the JavaScript inside the UIWebView that we are done recording the audio, so it can start accepting the data in blocks from us. I am using WebSockets to transfer the data back to JavaScript. The data is transferred in blocks because the server (https://github.com/benlodotcom/BLWebSocketsServer) that I was using was built on libwebsockets (http://git.warmcat.com/cgi-bin/cgit/libwebsockets/).
This is how you start the server in the delegate class.
- (id)initWithFrame:(CGRect)frame
{
if (self = [super initWithFrame:frame]) {
[self _createServer];
[self.server start];
myContents = [NSData data];
// Set delegate in order to "shouldStartLoadWithRequest" to be called
self.delegate = self;
// Set non-opaque in order to make "body{background-color:transparent}" working!
self.opaque = NO;
// Instantiate the JSON parser library
json = [ SBJSON new ];
// load our html file
NSString *path = [[NSBundle mainBundle] pathForResource:@"webview-document" ofType:@"html"];
[self loadRequest:[NSURLRequest requestWithURL:[NSURL fileURLWithPath:path]]];
}
return self;
}
-(void) _createServer
{
/*Create a simple echo server*/
self.server = [[BLWebSocketsServer alloc] initWithPort:9000 andProtocolName:echoProtocol];
[self.server setHandleRequestBlock:^NSData *(NSData *data) {
NSString *convertedString = [[NSString alloc] initWithData:data encoding:NSUTF8StringEncoding];
NSLog(#"Received Request...%#",convertedString);
if([convertedString isEqualToString:#"start"])
{
NSLog(#"myContents size: %d",[myContents length]);
int contentSize = [myContents length];
int chunkSize = 64*1023;
chunksCount = ([myContents length]/(64*1023))+1;
NSLog(#"ChunkSize=%d",chunkSize);
NSLog(#"chunksCount=%d",chunksCount);
chunksArray = [[NSMutableArray array] retain];
int index = 0;
//NSRange chunkRange;
for(int i=1;i<=chunksCount;i++)
{
if(i==chunksCount)
{
NSRange chunkRange = {index,contentSize-index};
NSLog(#"chunk# = %d, chunkRange=(%d,%d)",i,index,contentSize-index);
NSData *dataChunk = [myContents subdataWithRange:chunkRange];
[chunksArray addObject:dataChunk];
break;
}
else
{
NSRange chunkRange = {index, chunkSize};
NSLog(#"chunk# = %d, chunkRange=(%d,%d)",i,index,chunkSize);
NSData *dataChunk = [myContents subdataWithRange:chunkRange];
index += chunkSize;
[chunksArray addObject:dataChunk];
}
}
return [chunksArray objectAtIndex:0];
}
else
{
int chunkNumber = [convertedString intValue];
if(chunkNumber>0 && (chunkNumber+1)<=chunksCount)
{
return [chunksArray objectAtIndex:(chunkNumber)];
}
}
NSLog(#"Releasing Array");
[chunksArray release];
chunksCount = 0;
return [NSData dataWithBase64EncodedString:@"Stop"];
}];
}
The code on the JavaScript side is:
var socket;
var chunkCount = 0;
var soundBlob, soundUrl;
var smallBlobs = new Array();
function captureMovieCallback(response)
{
if(socket)
{
try{
socket.send('start');
}
catch(e)
{
log('Socket is not valid object');
}
}
else
{
log('socket is null');
}
}
function closeSocket(response)
{
socket.close();
}
function connect(){
try{
window.WebSocket = window.WebSocket || window.MozWebSocket;
socket = new WebSocket('ws://127.0.0.1:9000',
'echo-protocol');
socket.onopen = function(){
}
socket.onmessage = function(e){
var data = e.data;
if(e.data instanceof ArrayBuffer)
{
log('its arrayBuffer');
}
else if(e.data instanceof Blob)
{
if(soundBlob)
log('its Blob of size = '+ e.data.size + ' final blob size:'+ soundBlob.size);
if(e.data.size != 3)
{
//log('its Blob of size = '+ e.data.size);
smallBlobs[chunkCount]= e.data;
chunkCount = chunkCount +1;
socket.send(''+chunkCount);
}
else
{
//alert('End Received');
try{
soundBlob = new Blob(smallBlobs,{ "type" : "audio/wav" });
var myURL = window.URL || window.webkitURL;
soundUrl = myURL.createObjectURL(soundBlob);
log('soundURL='+soundUrl);
}
catch(e)
{
log('Problem creating blob and url.');
}
try{
var serverUrl = 'http://10.44.45.74:8080/MyTestProject/WebRecording?record';
var xhr = new XMLHttpRequest();
xhr.open('POST',serverUrl,true);
xhr.setRequestHeader("content-type","multipart/form-data");
xhr.send(soundBlob);
}
catch(e)
{
log('error uploading blob file');
}
socket.close();
}
//alert(JSON.stringify(msg, null, 4));
}
else
{
log('dont know');
}
}
socket.onclose = function(){
//message('<p class="event">Socket Status: '+socket.readyState+' (Closed)');
log('final blob size:'+soundBlob.size);
}
} catch(exception){
log('<p>Error: '+exception);
}
}
function log(msg) {
NativeBridge.log(msg);
}
function stopCapture() {
NativeBridge.call("stopMovie", null,null);
}
function startCapture() {
NativeBridge.call("captureMovie",null,captureMovieCallback);
}
NativeBridge.js
var NativeBridge = {
callbacksCount : 1,
callbacks : {},
// Automatically called by native layer when a result is available
resultForCallback : function resultForCallback(callbackId, resultArray) {
try {
var callback = NativeBridge.callbacks[callbackId];
if (!callback) return;
console.log("calling callback for "+callbackId);
callback.apply(null,resultArray);
} catch(e) {alert(e)}
},
// Use this in javascript to request native objective-c code
// functionName : string (I think the name is explicit :p)
// args : array of arguments
// callback : function with n-arguments that is going to be called when the native code returned
call : function call(functionName, args, callback) {
//alert("call");
//alert('callback='+callback);
var hasCallback = callback && typeof callback == "function";
var callbackId = hasCallback ? NativeBridge.callbacksCount++ : 0;
if (hasCallback)
NativeBridge.callbacks[callbackId] = callback;
var iframe = document.createElement("IFRAME");
iframe.setAttribute("src", "js-frame:" + functionName + ":" + callbackId+ ":" + encodeURIComponent(JSON.stringify(args)));
document.documentElement.appendChild(iframe);
iframe.parentNode.removeChild(iframe);
iframe = null;
},
log : function log(message) {
var iframe = document.createElement("IFRAME");
iframe.setAttribute("src", "ios-log:"+encodeURIComponent(JSON.stringify("#iOS#" + message)));
document.documentElement.appendChild(iframe);
iframe.parentNode.removeChild(iframe);
iframe = null;
}
};
We call connect() on the JavaScript side on body load of the HTML page.
Once we receive the callback (captureMovieCallback) from the startCapture function, we send a start message indicating that we are ready to accept the data.
The server on the Objective-C side splits the WAV audio data into small chunks of chunkSize = 64*1023 bytes and stores them in an array.
It sends the first block back to the JavaScript side.
JavaScript accepts this block and sends back the number of the next block it needs from the server.
The server sends the block indicated by that number. This process is repeated until we send the last block to JavaScript.
At the end we send a stop message back to the JavaScript side indicating that we are done; it is apparently 3 bytes in size (which is used as the criterion to break this loop).
Every block is stored as a small blob in an array. Now we create one bigger blob from these small blobs using the following line:
soundBlob = new Blob(smallBlobs,{ "type" : "audio/wav" });
This blob is uploaded to a server, which writes it out as a WAV file.
We can pass the URL of this WAV file as the src of an audio tag to replay it on the JavaScript side.
We close the WebSocket connection after sending the blob to the server.
Hope this is clear enough to understand.
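For completeness, here is a minimal sketch of wiring the soundUrl created above into an audio element on the page (the element id is an assumption):

// Assumed: an <audio id="player" controls></audio> element exists in the HTML.
var audio = document.getElementById('player');
audio.src = soundUrl; // soundUrl was created with createObjectURL(soundBlob) above
audio.play();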

If all you want to do is play the sound, then you'd be much better off using one of the native audio playback systems in iOS rather than the HTML audio tag.

Related

How to generate an HTML div with the content of a Crystal Report and pass it to JavaScript

I'm creating a PDF file with Crystal Reports, but I want to convert it to an HTML element instead so I can pass it to a JavaScript function and print the element.
Here is the method that generates the PDF file:
[HttpGet]
[AuthorizeUser()]
public ActionResult RaportFatureDTvsh(long FaturaID, int kursi = 1, int koefic = 1, string ekran_printer = "")
{
FatureDPaTvsh fdtvsh = new FatureDPaTvsh();
string connstring = System.Configuration.ConfigurationManager.ConnectionStrings[GlobalVariables.DbName].ConnectionString;
var builder = new EntityConnectionStringBuilder(connstring);
var sqlconstring = builder.ProviderConnectionString;
var sqlbuilder = new SqlConnectionStringBuilder(sqlconstring);
fdtvsh.SetDatabaseLogon(sqlbuilder.UserID, sqlbuilder.Password, sqlbuilder.DataSource, sqlbuilder.InitialCatalog);
ConnectionInfo cReportConnection = new ConnectionInfo();
cReportConnection.DatabaseName = sqlbuilder.InitialCatalog;
cReportConnection.ServerName = sqlbuilder.DataSource;
cReportConnection.UserID = sqlbuilder.UserID;
cReportConnection.IntegratedSecurity = false;
cReportConnection.Password = sqlbuilder.Password;
Tables tables = fdtvsh.Database.Tables;
TableLogOnInfo crtablelogoninfo = new TableLogOnInfo();
foreach (Table table in tables)
{
cReportConnection.DatabaseName = sqlbuilder.InitialCatalog;
crtablelogoninfo.ConnectionInfo = cReportConnection;
table.ApplyLogOnInfo(crtablelogoninfo);
table.Location = table.Name;
}
fdtvsh.SetParameterValue(fdtvsh.Parameter_FaturaID.ParameterFieldName, FaturaID);
fdtvsh.SetParameterValue(fdtvsh.Parameter_kursi.ParameterFieldName, kursi);
fdtvsh.SetParameterValue(fdtvsh.Parameter_koefic.ParameterFieldName, koefic);
FormulaFieldDefinition foField = fdtvsh.DataDefinition.FormulaFields["txtshenime"];
string shenime = db.Ndermarrja.Select(x => x.shenime).FirstOrDefault();
shenime = Regex.Replace(shenime, @"\r\n?|\n", "'+Chr(10)+ '");
//PictureObject picObject = (PictureObject)fdtvsh.ReportDefinition.ReportObjects["Picture1"];
foField.Text = string.Format("'{0}'", shenime);
string picPath = db.Ndermarrja.Select(x => x.logopath).FirstOrDefault();
if (!string.IsNullOrEmpty(picPath))
{
FormulaFieldDefinition picturePathFormula = fdtvsh.DataDefinition.FormulaFields["picturePath"];
picturePathFormula.Text = string.Format("'{0}'", picPath);
}
if (ekran_printer == "Printer" || ekran_printer == "")
{
Stream s = fdtvsh.ExportToStream(CrystalDecisions.Shared.ExportFormatType.PortableDocFormat);
return File(s, "application/pdf", "FatureA4.pdf");
}
else
{
Stream s = fdtvsh.ExportToStream(CrystalDecisions.Shared.ExportFormatType.PortableDocFormat);
return File(s, "application/pdf");
}
}
I need to change this part of the code so that it doesn't create a PDF file but puts its content in an HTML element instead.
I tried changing:
Stream s = fdtvsh.ExportToStream(CrystalDecisions.Shared.ExportFormatType.PortableDocFormat);
return File(s, "application/pdf", "FatureA4.pdf");
with :
Stream s = fdtvsh.ExportToStream(CrystalDecisions.Shared.ExportFormatType.HTML40);
return File(s, "text/html", "FatureA4.html");
but then I got this error:
System.IO.FileNotFoundException
HResult=0x80070002
Message=Could not find file 'C:\Users\User\AppData\Local\Temp\cr_export_bae61988-381d-4496-a2d5-f3c281786bdb.htm'.
Source=<Cannot evaluate the exception source>
StackTrace:
<Cannot evaluate the exception stack trace>
You should save the PDF file on your server and then download it on the client side, after the action returns to the view, with an AJAX request to an ActionResult method in your controller, by using:
HttpResponse Response = System.Web.HttpContext.Current.Response;
Response.ClearContent();
Response.Clear();
Response.ContentType = "application/pdf";
Response.AddHeader("Content-Disposition", "attachment; filename=" + yourFileName + ";");
Response.TransmitFile(your files full path);
Response.Flush();
Response.End();
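On the client side, the AJAX download could look roughly like the following sketch (the /Report/DownloadFature action name, its id parameter, and the faturaId variable are assumptions about your controller and page, not real routes):

// Request the PDF that the controller saved on the server and open it in a new tab for printing.
var xhr = new XMLHttpRequest();
xhr.open('GET', '/Report/DownloadFature?id=' + faturaId, true); // assumed action name and parameter
xhr.responseType = 'blob';
xhr.onload = function () {
    if (xhr.status === 200) {
        var url = window.URL.createObjectURL(xhr.response);
        window.open(url); // the browser's PDF viewer can print it from here
    }
};
xhr.send();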

App Extension (Action Extension) doesn’t open

For some reason that I don't understand, the Action Extension button (in the Share menu) doesn't respond. The action extension, at this point, grabs the URL from Safari (where it was launched from) to do some things afterwards. As a layer between the web page and the extension there is a JS file (maybe something is wrong there, I just copied it).
ViewController:
class ActionViewController: UIViewController {
var SafariURL: NSURL!
override func viewDidLoad() {
super.viewDidLoad()
let extensionItem = extensionContext?.inputItems.first as? NSExtensionItem
let itemProvider = extensionItem!.attachments?.first as? NSItemProvider
let propertyList = String(kUTTypePropertyList)
if itemProvider!.hasItemConformingToTypeIdentifier(propertyList) {
print("I'm here2")
itemProvider!.loadItem(forTypeIdentifier: propertyList, options: nil, completionHandler: { (item, error) -> Void in
let dictionary = item as? NSDictionary
OperationQueue.main.addOperation {
let results = dictionary![NSExtensionJavaScriptPreprocessingResultsKey] as? NSDictionary
let urlString = results!["currentUrl"] as? String
self.SafariURL = NSURL(string: urlString!)
}
})
} else {
print("error")
}
}
@IBAction func done() {
// Return any edited content to the host app.
// This template doesn't do anything, so we just echo the passed in items.
self.extensionContext!.completeRequest(returningItems: self.extensionContext!.inputItems, completionHandler: nil)
}
}
JS File:
var GetURL = function() {};
GetURL.prototype = {
run: function(arguments) {
arguments.completionFunction({ "currentUrl" : document.URL });
},
finalize: function(arguments) {
var message = arguments["statusMessage"];
if (message) {
alert(message);
}
}
};
var ExtensionPreprocessingJS = new GetURL;
Finally, you should change the content of override func viewDidLoad to:
super.viewDidLoad()
if let inputItem = extensionContext?.inputItems.first as? NSExtensionItem {
if let itemProvider = inputItem.attachments?.first {
itemProvider.loadItem(forTypeIdentifier: kUTTypePropertyList as String) { [self] (dict, error) in
guard let itemDictionary = dict as? NSDictionary else { return }
guard let javaScriptValues = itemDictionary[NSExtensionJavaScriptPreprocessingResultsKey] as? NSDictionary else { return }
self.Pageurl = javaScriptValues["currentUrl"] as? String ?? ""
}
}
}
JS is ok!

WKUserContentController: subsequent add/remove/add of script message handler results in the handler not being called from JavaScript

tl;dr
do
WKUserContentController* ucc = self.webView.configuration.userContentController;
[ucc addScriptMessageHandler:self name:@"serverTimeCallbackHandler"]; // add
[ucc removeScriptMessageHandlerForName:@"serverTimeCallbackHandler"]; // remove
[ucc addScriptMessageHandler:self name:@"serverTimeCallbackHandler"]; // add
...and window.webkit.messageHandlers.serverTimeCallbackHandler.postMessage(<messageBody>) will not be called from JavaScript.
description
I faced an issue while using WebKit.framework. It is either the framework's fault or inappropriate usage of it on my part.
I have a .js file which interacts with a webpage. At the end of its interaction, the script calls window.webkit.messageHandlers.<name>.postMessage(<messageBody>)
(function() {
var timeZoneCoockieName = 'cTz';
function getServerTimeWithTimeZone() {
PageMethods.GetServerTime(function(time) {
var serverTime = time;
var timezone = getCookie(timeZoneCoockieName);
var message = {'time': serverTime, 'timezone': timezone};
window.webkit.messageHandlers.serverTimeCallbackHandler.postMessage(message);
});
}
function getCookie(name) {
var value = "; " + document.cookie;
var parts = value.split("; " + name + "=");
if (parts.length == 2) return parts.pop().split(";").shift();
}
getServerTimeWithTimeZone();
})();
Basically, the script calls a page method to retrieve the time from the server and, in the callback, looks up the time zone stored in cookies.
The native component is designed to wrap this call and expose a convenient API to retrieve the date/time zone info.
- (void)getServerTimeWithCompletion:(void (^)(id, NSError *))completion {
// Adding same message handler twice, leads to an exception.
// Removing message handler with the same name, just in case.
[self.webView.configuration.userContentController removeScriptMessageHandlerForName:@"serverTimeCallbackHandler"];
[self.webView.configuration.userContentController addScriptMessageHandler:self name:@"serverTimeCallbackHandler"];
self.getServerTimeCompleteBlock = completion;
NSString * jsPath = [[NSBundle mainBundle] pathForResource:@"GetServerTime" ofType:@"js"];
if (![jsPath isBlank]) {
NSError* encodingError = nil;
NSString* jsContent = [NSString stringWithContentsOfFile:jsPath
encoding:NSUTF8StringEncoding
error:&encodingError];
if (!encodingError) {
[self.webView evaluateJavaScript:jsContent completionHandler: ^(id _Nullable result, NSError * _Nullable evalError) {
if (evalError) {
if (completion) { completion(nil, evalError); }
}
}];
} else {
if (completion) { completion(nil, encodingError); }
}
}
else {
if (completion) { completion(nil, nil); }
}
}
Now, if this method is called twice by the user, the second time the handler is not called from JavaScript.
The web debugger attached to the app's web view logs this error:
InvalidAccessError: DOM Exception 15: A parameter or an operation was not supported by the underlying object.
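As a first diagnostic step, you can probe from JavaScript whether the handler is still exposed to the page after the remove/add sequence (purely a diagnostic sketch, not a fix):

// Logs whether window.webkit.messageHandlers.serverTimeCallbackHandler survives the remove/add cycle.
try {
    var handler = window.webkit && window.webkit.messageHandlers
        ? window.webkit.messageHandlers.serverTimeCallbackHandler
        : null;
    console.log('serverTimeCallbackHandler present: ' + (handler ? 'yes' : 'no'));
    if (handler) {
        handler.postMessage({ probe: true });
    }
} catch (e) {
    console.log('postMessage failed: ' + e);
}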

Facebook Javascript API call to me/invitable_friends returns only 25 results on cordova but not on web

I'm developing a game on cordova that uses facebook integration. I have a facebook game canvas running on a secure site.
The friend request works fine on the web site version (returns more than 25 results, as I'm iterating the paging.next url that is also returned).
However, on the Cordova build (Android) it only ever returns the first result set of 25. It does still have the paging.next URL JSON field, but it just returns a response object with type=website.
Has anyone else come across this?
After quite a lot of digging, I found an issue with the way requests are handled in the Facebook library for Android. The current version of the com.phonegap.plugins.facebookconnect plugin uses Android FacebookSDK 3.21.1, so I'm not sure if this will still be an issue with v4.
A graph result with a paging URL is used to request the next page; however, using the entire URL, which includes https://graph.facebook.com/ as well as the usual graphAction, causes an incorrect result set to be returned. I determined that if you remove the scheme and host parts, the result is correct.
I modified ConnectPlugin.java to check that any scheme and host are removed from the graphAction. It seems to work well now.
ConnectPlugin.java before:
private void makeGraphCall() {
Session session = Session.getActiveSession();
Request.Callback graphCallback = new Request.Callback() {
@Override
public void onCompleted(Response response) {
if (graphContext != null) {
if (response.getError() != null) {
graphContext.error(getFacebookRequestErrorResponse(response.getError()));
} else {
GraphObject graphObject = response.getGraphObject();
JSONObject innerObject = graphObject.getInnerJSONObject();
graphContext.success(innerObject);
}
graphPath = null;
graphContext = null;
}
}
};
//If you're using the paging URLs they will be URLEncoded, let's decode them.
try {
graphPath = URLDecoder.decode(graphPath, "UTF-8");
} catch (UnsupportedEncodingException e) {
e.printStackTrace();
}
String[] urlParts = graphPath.split("\\?");
String graphAction = urlParts[0];
Request graphRequest = Request.newGraphPathRequest(null, graphAction, graphCallback);
Bundle params = graphRequest.getParameters();
if (urlParts.length > 1) {
String[] queries = urlParts[1].split("&");
for (String query : queries) {
int splitPoint = query.indexOf("=");
if (splitPoint > 0) {
String key = query.substring(0, splitPoint);
String value = query.substring(splitPoint + 1, query.length());
params.putString(key, value);
if (key.equals("access_token")) {
if (value.equals(session.getAccessToken())) {
Log.d(TAG, "access_token URL: " + value);
Log.d(TAG, "access_token SESSION: " + session.getAccessToken());
}
}
}
}
}
params.putString("access_token", session.getAccessToken());
graphRequest.setParameters(params);
graphRequest.executeAsync();
}
ConnectPlugin.java after:
private void makeGraphCall() {
Session session = Session.getActiveSession();
Request.Callback graphCallback = new Request.Callback() {
@Override
public void onCompleted(Response response) {
if (graphContext != null) {
if (response.getError() != null) {
graphContext.error(getFacebookRequestErrorResponse(response.getError()));
} else {
GraphObject graphObject = response.getGraphObject();
JSONObject innerObject = graphObject.getInnerJSONObject();
graphContext.success(innerObject);
}
graphPath = null;
graphContext = null;
}
}
};
//If you're using the paging URLs they will be URLEncoded, let's decode them.
try {
graphPath = URLDecoder.decode(graphPath, "UTF-8");
} catch (UnsupportedEncodingException e) {
e.printStackTrace();
}
String[] urlParts = graphPath.split("\\?");
String graphAction = urlParts[0];
///////////////////////
// SECTION ADDED
///////////////////////
final String GRAPH_BASE_URL = "https://graph.facebook.com/";
if(graphAction.indexOf(GRAPH_BASE_URL)==0) {
URL graphUrl = null;
try {
graphUrl = new URL(graphAction);
} catch (MalformedURLException e) {
e.printStackTrace();
}
graphAction = graphUrl.getPath();
}
///////////////////////
// END SECTION ADDED
///////////////////////
Request graphRequest = Request.newGraphPathRequest(null, graphAction, graphCallback);
Bundle params = graphRequest.getParameters();
if (urlParts.length > 1) {
String[] queries = urlParts[1].split("&");
for (String query : queries) {
int splitPoint = query.indexOf("=");
if (splitPoint > 0) {
String key = query.substring(0, splitPoint);
String value = query.substring(splitPoint + 1, query.length());
params.putString(key, value);
if (key.equals("access_token")) {
if (value.equals(session.getAccessToken())) {
Log.d(TAG, "access_token URL: " + value);
Log.d(TAG, "access_token SESSION: " + session.getAccessToken());
}
}
}
}
}
params.putString("access_token", session.getAccessToken());
graphRequest.setParameters(params);
graphRequest.executeAsync();
}
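For reference, this is roughly how the paging URL ends up being fed back into the plugin from the JavaScript side (a sketch assuming the facebookConnectPlugin.api call of the PhoneGap Facebook Connect plugin; handleFriends is a placeholder for your own handler):

// First page of results.
facebookConnectPlugin.api('me/invitable_friends', ['user_friends'], function (response) {
    handleFriends(response.data);
    if (response.paging && response.paging.next) {
        // paging.next is a full https://graph.facebook.com/... URL; the patched
        // ConnectPlugin.java above strips the scheme and host before issuing the request.
        facebookConnectPlugin.api(response.paging.next, ['user_friends'], function (nextResponse) {
            handleFriends(nextResponse.data);
        });
    }
});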
There's no way for Facebook to know whether you call their API from Cordova or from a website, so it's likely a problem on your side: maybe you use a different implementation of the API on Cordova than on the website, so that Cordova sends a different pagination request or hits another API version that handles pagination differently.

How to implement a phonegap call back to javascript

I have the following code working perfectly, except... well, for the callback!
- (void)readBarcode:(NSMutableArray*)arguments withDict:(NSMutableDictionary*)options
{
ZBarReaderViewController *reader = [ZBarReaderViewController new];
reader.readerDelegate = self;
ZBarImageScanner *scanner = reader.scanner;
[scanner setSymbology: ZBAR_EAN13
config: ZBAR_CFG_ENABLE
to: 1];
[[super appViewController] presentModalViewController:reader animated:YES];
[reader release];
}
- (void) imagePickerController:(UIImagePickerController*)reader didFinishPickingMediaWithInfo: (NSDictionary*) info
{
id<NSFastEnumeration> results = [info objectForKey: ZBarReaderControllerResults];
ZBarSymbol *symbol = nil;
for(symbol in results)
break;
resultText.text = symbol.data;
resultImage.image = [info objectForKey: UIImagePickerControllerOriginalImage];
NSString* retStr = [[NSString alloc]
initWithFormat:@"%@({ code: '%@', image: '%@' });",
resultText.text,resultImage.image];
[ webView stringByEvaluatingJavaScriptFromString:retStr ];
[reader dismissModalViewControllerAnimated: YES];
}
I then call the function from JavaScript:
function getIt(){
PhoneGap.exec("BarcodeReader.readBarcode", "myCallback");
}
The problem is that I don't understand how to call the 'myCallback' function from Objective-C (I admit I'm a total newbie).
This should work...
Add a property to the header file (see How To Add A Property In Objective C):
@property (nonatomic, copy) NSString *jsCallback;
Get the JavaScript callback name and set the property:
- (void)readBarcode:(NSMutableArray*)arguments withDict:(NSMutableDictionary*)options
{
ZBarReaderViewController *reader = [ZBarReaderViewController new];
reader.readerDelegate = self;
// Store the JavaScript callback name passed in from PhoneGap.exec
self.jsCallback = [arguments objectAtIndex:0];
ZBarImageScanner *scanner = reader.scanner;
[scanner setSymbology: ZBAR_EAN13
config: ZBAR_CFG_ENABLE
to: 1];
[[super appViewController] presentModalViewController:reader animated:YES];
[reader release];
}
Use the JavaScript callback name here:
- (void) imagePickerController:(UIImagePickerController*)reader didFinishPickingMediaWithInfo: (NSDictionary*) info
{
id<NSFastEnumeration> results = [info objectForKey: ZBarReaderControllerResults];
ZBarSymbol *symbol = nil;
for(symbol in results)
break;
resultText.text = symbol.data;
resultImage.image = [info objectForKey: UIImagePickerControllerOriginalImage];
// create the string
NSString* retStr = [[NSString alloc]
initWithFormat:@"%@({ code: '%@', image: '%@' });",
self.jsCallback, resultText.text, resultImage.image];
//execute
[ webView stringByEvaluatingJavaScriptFromString:retStr ];
[reader dismissModalViewControllerAnimated: YES];
}
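On the JavaScript side, the callback named in PhoneGap.exec then simply receives that object (a minimal sketch; how you use the fields is up to you):

// Invoked from native code via stringByEvaluatingJavaScriptFromString,
// e.g. myCallback({ code: '...', image: '...' });
function myCallback(result) {
    alert('Scanned barcode: ' + result.code);
}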
Please mark the other answer I provided you as correct also
