Can't use JSON.parse() to import array from PHP [duplicate] - javascript

This question already has answers here:
UTF-8 all the way through
(13 answers)
Closed 8 months ago.
So I've got a form where I can create multiple datalist text inputs of the same type; their values are later (onclick) put into an invisible input before the form is submitted.
These datalist text inputs are created onload, so that I can add more with a "+" button using the same function.
The options are imported from my PHP database, and it worked fine. But suddenly it stopped working and I don't know why. I've tried a thousand things but can't figure it out.
I'm quite new to PHP. I think the problem has to do with JSON.parse(), since the code breaks on that line.
script.js
var ajax = new XMLHttpRequest();
ajax.open("GET", "fetch data.php", true);
ajax.send();
ajax.onreadystatechange = function() {
    if (this.readyState == 4 && this.status == 200) {
        var data = JSON.parse(this.responseText);
        var html = "";
        for (var a = 0; a < data.length; a++) {
            var firstName = data[a].name;
            html += "<option value='" + firstName + "'></option>";
        }
        document.getElementById(type + "list" + addnumber).innerHTML = html;
    }
};
type+"list"+addnumber is the name of the text input box. type is an argument and addnumber is a variable (an integer).
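When the endpoint emits a PHP warning or notice before the JSON payload, JSON.parse throws and the useful server message stays hidden. A small sketch (not part of the original question; the function name is made up) that surfaces the raw response text on parse failure:

```javascript
// Wrap JSON.parse so that when the server response is not valid JSON
// (e.g. a PHP warning printed before the payload), the raw text is
// returned instead of an opaque SyntaxError.
function parseResponse(responseText) {
    try {
        return { ok: true, data: JSON.parse(responseText) };
    } catch (e) {
        return { ok: false, error: e.message, raw: responseText };
    }
}

// A clean response parses normally:
var good = parseResponse('[{"name":"Miles Davis"}]');
// A response with a PHP warning mixed in fails to parse, but `raw`
// shows exactly what the server sent:
var bad = parseResponse('<b>Warning</b>: ... [{"name":"Miles Davis"}]');
```

Logging `raw` in the readystatechange handler usually reveals the exact PHP error that broke the response.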
fetch data.php
<?php
$host = "localhost"; $user = "root"; $pass = ""; $db = "collection";
$conn = mysqli_connect($host, $user, $pass, $db);
$result = mysqli_query($conn, 'SELECT name FROM musicians ORDER BY name');
$data = array();
while ($row = mysqli_fetch_assoc($result)) {
    $data[] = $row;
}
echo json_encode($data);
?>
Also, I might add that this function creates objects in three places on the same page but the value is added/moved to three different invisible inputs.

Based on the exception you're seeing, i.e.:
JsonException: Malformed UTF-8 characters, possibly incorrectly encoded in C:\wamp64\www\collection\fetch data.php on line 11
My guess is that the data you are reading is not encoded in a way that json_encode expects.
The simplest (NOT RECOMMENDED) approach is to pass in the JSON_INVALID_UTF8_IGNORE or JSON_INVALID_UTF8_SUBSTITUTE flags, which will cause bad data to be silently skipped over (or replaced with the Unicode REPLACEMENT CHARACTER, U+FFFD), rather than treated as an error. See the documentation for json_encode and the predefined JSON constants.
If you want to get to the root of the problem so that all data is correctly encoded as JSON, however:
You can try to force the encoding by setting the MySQL character set with PHP's mysqli_set_charset function, e.g.:
<?php
$host = "localhost"; $user = "root"; $pass = ""; $db = "collection";
$conn = mysqli_connect($host, $user, $pass, $db);
if (!mysqli_set_charset($conn, 'utf8')) {
    throw new \Exception("failed to set mysql charset");
}
$result = mysqli_query($conn, 'SELECT name FROM musicians ORDER BY name');
...
?>
The most common charsets in my experience are utf8 and utf8mb4. If you know your data contains some other specific character set, you may need to convert it to utf8 before encoding it as JSON, using PHP's mb_convert_encoding function.
Finally, it could be that the issue occurred earlier in your application, leaving bad (mixed-encoding) data in the database. If so, you'll need to detect the bad data row by row, perhaps writing the rows that raise exceptions to a separate error report, and manually correct their encoding. This can be prevented by validating data and correctly encoding it as utf8 before it reaches your database. Note that ensuring the MySQL character set is correctly set for all connections is also potentially part of this solution; you may want to configure your database to do this automatically.
example of detection and logging:
<?php
$host = "localhost"; $user = "root"; $pass = ""; $db = "collection";
$conn = mysqli_connect($host, $user, $pass, $db);
if (!mysqli_set_charset($conn, 'utf8')) {
    throw new \Exception("failed to set mysql charset");
}
$result = mysqli_query($conn, 'SELECT name FROM musicians ORDER BY name');
$data = array();
while ($row = mysqli_fetch_assoc($result)) {
    try {
        // detect rows which cannot be encoded
        $discard = json_encode($row, JSON_THROW_ON_ERROR);
    } catch (\JsonException $e) {
        // keep track of failed rows so we can correct them later
        my_custom_logging_function($row);
        // and skip the row so we don't try to encode it later
        continue;
    }
    $data[] = $row;
}
echo json_encode($data, JSON_THROW_ON_ERROR);
?>
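For what it's worth, the same class of bad data can be detected outside PHP as well. This is a hedged Node.js sketch (not from the answer above) using a strict TextDecoder, which rejects exactly the malformed byte sequences that make json_encode fail:

```javascript
// Check whether a byte sequence is valid UTF-8. A strict TextDecoder
// throws on malformed bytes, i.e. the same data that triggers
// "Malformed UTF-8 characters" from json_encode on the PHP side.
function isValidUtf8(bytes) {
    try {
        new TextDecoder("utf-8", { fatal: true }).decode(bytes);
        return true;
    } catch (e) {
        return false;
    }
}

// "café" encoded as UTF-8 (0xC3 0xA9 for the "é") is valid;
// the same text in Latin-1 (a lone 0xE9) is not valid UTF-8.
var utf8Bytes = new Uint8Array([0x63, 0x61, 0x66, 0xC3, 0xA9]);
var latin1Bytes = new Uint8Array([0x63, 0x61, 0x66, 0xE9]);
```

This mirrors the row-by-row detection idea: a false result identifies the records whose encoding needs to be corrected.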

OK, just change "fetch data.php" to "data.php"; a filename without spaces avoids URL-encoding issues in the GET request.

Related

Display JSON datas in column with PHP then return selection in the same format

I am currently working on a ticket reservation script (with date, time of reservation, and quantity available).
I am trying to communicate with an API which is supposed to send me data from the database in JSON format, so that I can interpret it and display it in PHP as columns: the day in the column header, the hour in each cell, and a default quantity (in my example we start from 60/60, which is decremented by 1 when the user makes a selection).
For the moment, I'm just trying to automatically create a column for each date, with the values to select and the remaining quantity (knowing that a selection decrements it by 1), then return the result in JSON format to the API to update the database and save the selection by assigning the user's IP to it.
I started a small script which retrieves the elements of the database in JSON format as an example. I think I have to create a foreach loop to build my columns, but I'm stuck at the moment. Thank you for any leads or help you can give me.
(The question included a screenshot of the intended layout: one column per date, hours in the cells, and the remaining quantity next to each.)
<?php
try {
    $pdo = new PDO(
        'mysql:host=localhost;dbname=date_booking',
        'root', '',
        array(PDO::MYSQL_ATTR_INIT_COMMAND => "SET NAMES utf8"));
} catch (PDOException $e) {
    echo $e->getMessage();
}
$statement = $pdo->prepare("SELECT * FROM tbl_booking");
$statement->execute();
$datas = array();
while ($res = $statement->fetch(PDO::FETCH_ASSOC)) {
    $datas[] = $res;
}
foreach ($datas as $key => $value) {
    echo '<pre>';
    echo $value["date_booking"] . ' | ' . $value["hour_booking"] . ' | ' . $value["nb_booking"];
    echo '</pre>';
}
Start by selecting ONLY the columns you want from the table; then you can simply use fetchAll() to return the complete result set as an array of arrays or objects (I used objects in the example below).
It is then simple to turn that into a JSON string to return to the caller using json_encode().
<?php
try {
    $pdo = new PDO(
        'mysql:host=localhost;dbname=date_booking',
        'root', '',
        array(PDO::MYSQL_ATTR_INIT_COMMAND => "SET NAMES utf8"));
} catch (PDOException $e) {
    echo $e->getMessage();
}
$statement = $pdo->prepare("SELECT date_booking, hour_booking, nb_booking
                            FROM tbl_booking");
$statement->execute();
$rows = $statement->fetchAll(PDO::FETCH_OBJ);
echo json_encode($rows);
Then in your JavaScript you have an array of objects that you can place into your page however you want.
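A minimal sketch of consuming that JSON in the browser (the property names are taken from the query above; the rendering itself is just an illustration, not part of the answer):

```javascript
// Turn the parsed booking rows into the same "date | hour | quantity"
// lines the original PHP loop printed.
function renderBookings(rows) {
    return rows.map(function (r) {
        return r.date_booking + " | " + r.hour_booking + " | " + r.nb_booking;
    }).join("\n");
}

var rows = JSON.parse('[{"date_booking":"2021-06-01","hour_booking":"10:00","nb_booking":60}]');
var text = renderBookings(rows);
// text → "2021-06-01 | 10:00 | 60"
```

Grouping the rows by date_booking before rendering would give one column per date, as the question describes.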

Get variable value by name from a String

This is for the server side, regardless of the client.
$data= file_get_contents('textfile.txt');
The textfile.txt contains
var obTemperature = "55";
var obIconCode = "01";
What can I enter so I can echo obTemperature's value of 55?
Is there not a simple PHP interface to read var values by name?
Please, no overcomplicated/half answers/trolling.
You would be better off explaining what you want to do in general, but if you are tied to this file format and the format is consistent:
$data = str_replace('var ', '$', $data);
eval($data);
echo $obTemperature;
echo $obIconCode;
However, any other JavaScript in the file will cause a parse error; also note that eval() executes whatever the file contains, so only use this on fully trusted input.
Also, you can treat it as an ini file:
$data = parse_ini_string(str_replace('var ', '', $data)); // strip 'var ' from the string before parsing
echo $data['obTemperature'];
Or just:
$data = parse_ini_string($data);
echo $data['var obTemperature'];
You can use a regular expression:
preg_match('/var obTemperature = "(\d+)";/', $data, $match);
$temperature = $match[1];
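If the file is instead fetched in the browser, the same extraction can be done in JavaScript without eval; a sketch assuming the exact `var name = "value";` format shown above:

```javascript
// Extract `var name = "value";` declarations into a plain object.
function parseVars(text) {
    var vars = {};
    var re = /var\s+(\w+)\s*=\s*"([^"]*)";/g;
    var m;
    while ((m = re.exec(text)) !== null) {
        vars[m[1]] = m[2];
    }
    return vars;
}

var file = 'var obTemperature = "55";\nvar obIconCode = "01";';
var parsed = parseVars(file);
// parsed.obTemperature → "55", parsed.obIconCode → "01"
```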

Using PHP string in SQL query works when typed in but not when generated via javascript

$query = "
select * from tablename where userid = :uid AND date = :date
";
$query_params = array(
':uid' => $_SESSION['user']['uid'],
':date' => $date
);
try
{
$stmt = $db->prepare($query);
$result = $stmt->execute($query_params);
}
I am trying to run this simple SQL query via PHP. When I hard-code a date string like $date = '2016-07-29'; then I get the proper number of results. When I dynamically generate the same string using JavaScript like
$date = '<script> var currentdate = new Date();
document.write(currentdate.getFullYear()+"-"+
("0"+(currentdate.getMonth()+1)).slice(-2)+"-"+
("0"+currentdate.getDate()).slice(-2));
</script>';
then I get 0 results. Any ideas? Echoing $date in both instances produces the same result (type = string).
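For context (this note is not part of the original post): PHP never executes the `<script>` tag; it only outputs it, so here $date is the literal markup string, which matches no row in the table. If the date must come from the client, it has to be computed in the browser and sent to the server as a request parameter. The formatting logic itself, pulled out of the inline string, looks like this (a sketch):

```javascript
// Format a Date as YYYY-MM-DD using the same slice(-2) zero-padding
// the question inlines in its <script> string.
function formatDate(d) {
    return d.getFullYear() + "-" +
        ("0" + (d.getMonth() + 1)).slice(-2) + "-" +
        ("0" + d.getDate()).slice(-2);
}

// The value can then be sent to the server explicitly, e.g.:
// window.location = "results.php?date=" + formatDate(new Date());
```

Server-side, PHP's own date('Y-m-d') produces the same string without any round trip.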

inserting integer variable into Database

How can I put this int value into our database? I've tried putting (int) and intval() before $_POST but it still doesn't work. Everything else works except the conversion(?) of that int value so it can be placed in our database. Did we miss anything in the code? Thank you in advance.
function computeScoregrammar(){
    // code for computing the score here
    aver = 5;
    <?php
    //db connection here
    $avegs = $_POST['aver'];
    $qidScores = 4;
    //below part is not yet complete for we are only trying to update a sample data in the database
    if ($qidScores == 4) {
        $qs = "UPDATE scores SET GrammarScore = '$avegs' WHERE applicantID = '$qidScores'";
        mysqli_query($conn, $qs);
        mysqli_close($conn);
    } else {
        //else statement here
    }
    ?>
Try this:
$qs = "UPDATE scores SET GrammarScore = $avegs WHERE applicantID = '$qidScores'";
i.e. $avegs without quotes around it. If you quote it, $avegs will be treated as a string.
WARNING: this is open to SQL injection; do not put it online! Use PDO with prepared statements, or at least escaping (mysqli_escape_string()), and remove the quotes.

Chunking MySQL array with json encode

I know there are some existing questions about chunking a MySQL array in PHP, but my problem is that I want to keep the output in JSON.
Scenario:
I want to get data from MySQL, do some things with it (like time formatting), and output it as JSON.
The JSON data is parsed in the browser and visualized with a JavaScript chart.
Problem:
All of the above works, but because of the huge amount of data I'm getting an out-of-memory error when I select bigger date ranges.
The idea of directly sending out every x lines of data doesn't work because of the JSON format it needs to be in. Several separate JSON chunks won't do; the chart needs one.
So in the end I need to chunk the data but keep it as one big JSON document.
(And raising the memory limit is not really a solution.)
Ideas:
One idea would be to let the browser chunk the date range and request the data in chunks, then put them together.
Of course this would work, but if there is a way to do it server side, that would be better.
Code:
private function getDB($date1, $date2) {
    // date = datetime !
    $query = 'SELECT * FROM `db1`.`' . $table . '` WHERE `date` BETWEEN "' . $date1 . '" AND "' . $date2 . '" ORDER BY `date`;';
    $result = $this->db->query($query);
    $array = array();
    while ($row = $result->fetch_assoc()) {
        // the formatting needs to be done, so the chart accepts it..
        $array[] = array(strtotime($row['date']) * 1000, (float)$row['var']);
    }
    $result->close();
    return json_encode($array);
}
Since this is not an option:
ini_set("memory_limit", "32M")
perhaps you can add LIMIT to the function parameters and the query:
private function getDB($date1, $date2, $start, $pageSize) {
    // date = datetime !
    // note: the LIMIT values must be concatenated in; inside the
    // single-quoted string, $start and $pageSize would not be interpolated
    $query = 'SELECT * FROM `db1`.`' . $table . '` WHERE `date` BETWEEN "' . $date1 . '" AND "' . $date2 . '" ORDER BY `date` LIMIT ' . (int)$start . ', ' . (int)$pageSize . ';';
    $result = $this->db->query($query);
    $array = array();
    while ($row = $result->fetch_assoc()) {
        // the formatting needs to be done, so the chart accepts it..
        $array[] = array(strtotime($row['date']) * 1000, (float)$row['var']);
    }
    $result->close();
    return json_encode($array);
}
Then set up a loop in JavaScript, calling this with Ajax and incrementing the $start variable each time.
Store each responseText.slice(1, -1) (the chunk with its outer brackets stripped) in an array.
When the responseText is "[]", all of the records have been returned.
.join the array with a comma, then wrap it in a new opening and closing "[ ]", and you should have JSON equivalent to all the records.
Minimal parsing, and you'll be using built-in functions for most of it.
var startRec = 0;
var pageSize = 50;
var xmlhttp = new XMLHttpRequest();
var aryJSON = [];
var jsonText; // avoid shadowing the global JSON object
xmlhttp.onreadystatechange = function() {
    if (xmlhttp.readyState == 4 && xmlhttp.status == 200) {
        if (xmlhttp.responseText == "[]") {
            // An empty page means all records are received
            jsonText = "[" + aryJSON.join(",") + "]";
            aryJSON = [];
            startRec = 0;
        } else {
            // strip the chunk's outer [ and ] before storing it
            aryJSON.push(xmlhttp.responseText.slice(1, -1));
            startRec += pageSize;
            getNextPage();
        }
    }
};
function getNextPage() {
    xmlhttp.open("GET", "your.php?start=" + startRec + "&pageSize=" + pageSize, true);
    xmlhttp.send();
}
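To make the reassembly step concrete, here is a self-contained sketch of just the string logic (it assumes every page is itself a valid JSON array, which json_encode guarantees for these rows):

```javascript
// Reassemble page-sized JSON array chunks into one JSON array string:
// strip each chunk's outer brackets, join with commas, re-wrap once.
function joinJsonArrayChunks(chunks) {
    var inner = chunks.map(function (chunk) {
        return chunk.slice(1, -1); // drop the leading '[' and trailing ']'
    });
    return "[" + inner.join(",") + "]";
}

var pages = ['[[1,10],[2,20]]', '[[3,30]]'];
var combined = joinJsonArrayChunks(pages);
// JSON.parse(combined) → [[1,10],[2,20],[3,30]]
```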
I would recommend that you have the server send the browser exactly what it needs to create the table. Parsing can be a heavy task, so why have the client do that lifting?
I would have your backend send the browser some kind of data structure that represents the table (i.e. list of lists), with all the formatting already done. Rendering the table should be faster and less memory-intensive.
One way to answer this is to do the chunking on the server, streaming out the JSON yourself and removing each chunk's leading [ and trailing ].
#apache_setenv('no-gzip', 1);
#ini_set('zlib.output_compression', 0);
#ini_set('implicit_flush', 1);
$array = array();
echo '[';
$started = false;
while ($row = $result->fetch_assoc()) {
    $array[] = [ strtotime($row['datetime']) * 1000, (float)$row['var'] ];
    if (sizeof($array) == 1000) {
        if ($started) {
            echo ',';
        } else {
            $started = true;
        }
        // converting [[datetime1, value1],[datetime2, value2]]
        // to [datetime1, value1],[datetime2, value2]
        echo substr(substr(json_encode($array), 1), 0, -1);
        ob_flush();
        $array = array();
    }
}
if ($started) echo ',';
$this->flushJSON($array);
echo ']';
flush();
$result->close();
This works and reduces the RAM usage to 40%.
Still, it seems that Apache is buffering something, so RAM usage increases over the time the script is running. (Yes, the flush is working; I debugged that, that's not the problem.)
But because of that remaining increase, the fastest way to achieve clean chunking is to do it as alfadog67 pointed out.
Also, I should mention that I had to disable output compression, otherwise Apache wouldn't flush directly.
