playing 40.000 unit army for game [closed] - javascript

Closed. This question needs to be more focused. It is not currently accepting answers.
Closed 7 years ago.
SOLVED!! (look at my last edit)
I want to make an army fight of 20,000 vs 20,000 units on canvas. The data for every unit is:
{
'id' => 17854,
'x' => 1488,
'y' => 1269,
'team' => 'red',
'health' => 10,
'target' => [1486, 1271]
}
And I would like to see this fight in real time (25 frames per second).
If I generate one frame as JSON and save it to a file, it is 2.5 MB in size (40k units with this data).
One second (25 frames) = 62.5 MB of file size. And a fight could last about 30 minutes, so it would use 112 GB. This is bad. If I wrote the file as binary data, it would take about 27 times less space, or 4 GB for 30 minutes. That's still bad: a 2-hour movie takes 700 MB.
I need to save many fights on the server. Real players will build their armies and fight each other, so the fights need to be saved. Every fight is unique, because every unit's damage is random (0~10) and after a unit kills an enemy it regenerates 1 health. I do the calculations with PHP and save the files on the server. All 40,000 units are on one screen and can all be visible at once; I want it like that. For now, units are single 2x2 pixel squares, red and blue teams, so they are easy for JavaScript to draw.
But generating the file with PHP takes about an hour, which is another problem. For every frame I need to iterate over all 40,000 units (to update x/y, search for nearby enemies or friends, then deal damage and return the target or the killed enemy's coordinates), then iterate again to unset killed units, and, before writing everything to the file, iterate once more to remove the data that was only used for the calculations. To finish the 30-minute fight I need to repeat this 45,000 times. Every minute there are also fewer and fewer units. My point is to somehow get all of this file generation under a minute; I just need a logical way, if one exists.
Questions:
1) What is the best way to save my files on the server and make them much smaller in size?
(so far: use binary data and compress to zip)
2) What is the fastest way to calculate my fights?
(so far: compile with C++)
// Edited. Here is my whole game code, just like that :)
This is the main action:
class Simulator
{
    private $units;
    private $places = [];
    private $oldPlaces = [];

    public function initiateGame() {
        $this->createUnits();
        $this->startMoving();
    }

    private function createUnits() {
        foreach (range(0, 150) as $column) { // I like an exact army formation, so it's roughly 150x140 = 21000 units per team
            foreach (range(0, 140) as $row) {
                $this->setUnits($column, $row);
            }
        }
        $this->oldPlaces = $this->places;
    }

    private function setUnits($column, $row) {
        $beginning_of_team_A = 6;      // starting point on canvas for team A, to look nice on screen
        $unit_size_and_free_place = 6; // unit size = 3x3 (see the JS), plus 3 pixels of free space between units
        $beginning_of_team_B = 1100;   // the other side, where the enemy army starts appearing

        $x_a = $beginning_of_team_A + $column * $unit_size_and_free_place; // team A
        $y   = $beginning_of_team_A + $row * $unit_size_and_free_place;    // same for both teams
        $unitA = new Unit($x_a, $y, 1);  // 1 is team A (it always moves +1 pixel every frame)
        $this->units[] = $unitA;

        $x_b = $beginning_of_team_B + $column * $unit_size_and_free_place; // team B
        $unitB = new Unit($x_b, $y, -1); // -1 is team B (it always moves -1 pixel every frame)
        $this->units[] = $unitB;

        $this->places[$x_a.','.$y] = 1;  // occupied positions, used to calculate each unit's next move
        $this->places[$x_b.','.$y] = -2;
    }

    private function startMoving() {
        set_time_limit(30000); // raise the execution time limit; by default PHP stops the script after a short time
        foreach (range(0, 400) as $frame) { // 400 frames of action (about 16 seconds at the client's 40 ms interval)
            $this->places = [];
            foreach ($this->units as $unit) {
                $returned = $unit->move($this->oldPlaces); // give the whole occupancy map to every unit so it can look ahead
                $this->places[$returned[0]] = $returned[1]; // returns [next position as string '1514,148', team as int]
            }
            file_put_contents('assets/games/'.$frame.'.json', json_encode($this->units)); // writing every frame to a file uses ~2mb
            $this->oldPlaces = $this->places; // reset old positions
        }
    }
}
This is the unit:
class Unit
{
    public $x = 0;
    public $y = 0;
    public $team = 1;
    public $stopped;

    public function __construct($x, $y, $team, $stopped = false) {
        $this->x = $x;
        $this->y = $y;
        $this->team = $team;
        $this->stopped = $stopped;
    }

    public function move($places) {
        $this->checkForward($places);
        return [$this->x.','.$this->y, $this->team];
    }

    private function checkForward($places) {
        $forward  = $this->x + $this->team;     // TODO: find a formula to replace the 4 ifs
        $forward1 = $this->x + $this->team * 2;
        $forward2 = $this->x + $this->team * 3;
        $forward3 = $this->x + $this->team * 4;
        if (isset($places[$forward.','.$this->y])) {
            $this->stopped = true;
        } else if (isset($places[$forward1.','.$this->y])) {
            $this->stopped = true;
        } else if (isset($places[$forward2.','.$this->y])) {
            $this->stopped = true;
        } else if (isset($places[$forward3.','.$this->y])) {
            $this->stopped = true;
        } else {
            $this->stopped = false;
        }
        if ($this->stopped == false) { // move forward if not stopped
            $this->x = $this->x + $this->team;
        }
    }
}
This is the JS:
var app = angular.module('app', []);

app.controller('game', function($scope, $http, $interval) {
    var canvas = document.getElementById("game"),
        context = canvas.getContext("2d");
    var frame = -2;
    $scope.attacking = false;
    var units = [];

    function start_animation_loop() {
        $scope.promise = $interval(function() {
            if ($scope.attacking == true) {
                frame++;
                if (frame >= 0) {
                    downloadFile();
                    animate();
                }
            }
        }, 40);
    }

    function downloadFile() {
        $http.get('assets/games/' + frame + '.json').success(function(response) {
            units = response;
        });
    }

    function animate() {
        clear_canvas();
        draw();
    }

    function clear_canvas() {
        context.clearRect(0, 0, 1800, 912);
    }

    function draw() {
        for (var a = 0; a < units.length; a++) {
            context.beginPath();
            // set the colour before drawing, otherwise this rect is painted with the previous fillStyle
            if (units[a]['team'] == 1) {
                context.fillStyle = 'red';
            } else {
                context.fillStyle = 'blue';
            }
            context.fillRect(units[a]['x'], units[a]['y'], 3, 3);
        }
    }

    start_animation_loop();
});
SOLVED! Thanks to a colleague at work, who gave me a brilliant idea!
To get the result I needed, for every battle I just generate one random number (0~10000) and store it in MySQL, together with the formations, units, their starting strength, health and everything else.
And all the calculations are done with JavaScript:
With one constant number (given by the backend) I build a formula that always reproduces the same army fight: every unit moves forward and they always stop at the same places, no matter what. The uniqueness comes from the random damage every unit deals and everything that follows from it. Each unit's damage is just its x/y position combined somehow with the constant number, so it looks random per unit (they are all at different map positions) but always falls in 0~10. With the same constant number, every unit always deals the same damage, moves the same way and dies at the same moment in every replay. All the hard work is on the JavaScript side: doing these calculations from the constant number.
The random number can be anything. If for the first fight I generate the number 17 and for the next battle 19666516546, it does not mean that the battle with 17 does less damage; every unit still gets its "random" damage, but replays with the same formations, unit counts, starting positions and the same generated number are always identical, so there is no more need to save any files! And I can add various special effects, defences, evasions, and it all fits in two MySQL rows, one per team :) Cool!!
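One common way to realize this kind of deterministic replay in JavaScript is to hash the battle seed together with each unit's position. The sketch below is purely illustrative (the function name and mixing constants are assumptions, not the asker's actual formula):

// Illustrative only: derive deterministic "random" damage (0..10) from the stored
// battle seed plus a unit's position and the current frame, so every replay is identical.
function damageFor(seed, x, y, frame) {
    // Mix the inputs with an integer hash (the multipliers are arbitrary odd constants).
    let h = (seed ^ Math.imul(x, 73856093) ^ Math.imul(y, 19349663) ^ Math.imul(frame, 83492791)) >>> 0;
    h = Math.imul(h ^ (h >>> 16), 2246822519) >>> 0;
    h = Math.imul(h ^ (h >>> 13), 3266489917) >>> 0;
    h ^= h >>> 16;
    return (h >>> 0) % 11; // 0..10 inclusive
}

// Same seed + same position + same frame => same damage on every replay.
console.log(damageFor(17, 1488, 1269, 0) === damageFor(17, 1488, 1269, 0)); // true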

id can be implicit in the storage medium. Sure, that means you have to save gaps, but you can compress said gaps.
'x' => 1488,
'y' => 1269,
depending on the canvas size, this can be compressed. If the canvas is 1e6 x 1e6 (a million by a million), there are 1e12 locations, which fits in ~40 bits.
'team' => 'red',
with 2 sides, this is 1 bit.
'health' => 10,
The vast majority of units have low health, so units with health < 15 can be stored in 4 bits. If all 4 bits are set, we look up the unit's health elsewhere (in an id->health table).
'target' => [1486, 1271]
We could store an independent target for each unit, but that probably doesn't match how the UI works. You probably select a pile of units, and tell them to go somewhere, no? ~40 bits for a location, ~24 bits for a reference count, for 8 bytes per target.
If we give each side a limit of ~65k targets, that is 16 bits.
16+4+1+40 = 61 bits, which means we have 3 more bits to play with to pack each unit into 64 bits.
At 64 bits per unit, that is 160k per side. Plus up to half-a-meg of target data, but that can be handled dynamically.
Plus a health overflow table (that maps id to health), which should usually be close to empty. If you do this sometimes, you can set up before-after id maps to maintain a consistent history (say, when half the units are dead, you do a compression pass with a before-after id mapping).
If id need not be consistent, you can compress the units down so that they are no longer sparse.
The target trick -- the target could be a unit ID if less than 40,000, and if above that value it would be a waypoint. That reduces your waypoints to ~15k ones. (I was assuming target was set by the UI, where someone would select a chunk of units and order them to go somewhere.)
You'd want to iterate over the packed data, skipping "dead" units (bitmask on health), unpacking them into a usable structure, evaluating their action, and writing them back. Double-buffering is an option (where you read from one buffer, and write to another), but it has a modest cost. Because units are fixed size, you can lookup other units fast (and unpack them), which helps if your targets are other unit ids, and with things like doing damage.
Single-buffering makes things easier, because things like simultaneous damage are tricky. It does mean that lower-id units act first -- you can fix this by flipping a coin each turn to determine if you iterate forward or backwards (and stick one side in low-ids, the other in high-ids), or make that an initiative check at the start of combat.
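For illustration, here is a minimal JavaScript sketch of the 64-bit layout described above (the field order, helper names and the x*width+y location index are assumptions, not part of the answer itself):

// Pack one unit into a single 64-bit value:
// [3 spare][16 target][4 health][1 team][40 location index], using BigInt
// because the 40-bit location does not fit in 32-bit bitwise operations.
const LOC_BITS = 40n, TEAM_BITS = 1n, HEALTH_BITS = 4n, TARGET_BITS = 16n;

function packUnit(locIndex, team, health, targetIndex) {
    const h = BigInt(Math.min(health, 15)); // 15 (all bits set) means "see the overflow table"
    return (BigInt(targetIndex) << (LOC_BITS + TEAM_BITS + HEALTH_BITS)) |
           (h << (LOC_BITS + TEAM_BITS)) |
           (BigInt(team) << LOC_BITS) |
           BigInt(locIndex);
}

function unpackUnit(packed) {
    return {
        loc:    Number(packed & ((1n << LOC_BITS) - 1n)),
        team:   Number((packed >> LOC_BITS) & 1n),
        health: Number((packed >> (LOC_BITS + TEAM_BITS)) & 15n),
        target: Number((packed >> (LOC_BITS + TEAM_BITS + HEALTH_BITS)) & ((1n << TARGET_BITS) - 1n)),
    };
}

// Round trip for a unit at (1488, 1269) on an assumed 1e6-wide grid:
const packed = packUnit(1488 * 1000000 + 1269, 1, 10, 42);
console.log(unpackUnit(packed)); // { loc: 1488001269, team: 1, health: 10, target: 42 }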

Working on the PHP side of things, here is some initialization code to set up the red and blue teams randomly. You'll notice that the data being stored is in the form of human-readable strings, so it is relatively easy to see what is going on just by looking at the data.
$red = array(); // stats per red unit
$blue = array(); // stats per blue unit
$sites = array(); // units (red and/or blue) at a location
$start = microtime(true);
echo "Generating red team:<br>\n";
for ($r=0; $r<20000; $r++)
{
$x = mt_rand(0,1940/2);
$y = mt_rand(0,1280);
$h = 10;
$red[$r] = "r:$r;x:$x;y:$y;h:$h";
if ( $r < 3 )
echo "... " . $red[$r] . "<br>\n";
if ( $r == 3 )
echo "...<br>\n";
if ( $r >= 19997 )
echo "... " . $red[$r] . "<br>\n";
@$sites[$x][$y][] = "r:$r";
}
$now = microtime(true);
echo "Red side generated, total time used " . ($now - $start) . "<br><br>\n";
echo "Generating blue team:<br>\n";
for ($b=0; $b<20000; $b++)
{
$x = mt_rand(1940/2,1940);
$y = mt_rand(0,1280);
$h = 10;
$blue[$b] = "b:$b;x:$x;y:$y;h:$h\n";
if ( $b < 3 )
echo "... " . $blue[$b] . "<br>\n";
if ( $b == 3 )
echo "...<br>\n";
if ( $b >= 19997 )
echo "... " . $blue[$b] . "<br>\n";
@$sites[$x][$y][] = "b:$b";
}
$now = microtime(true);
echo "Blue side generated, total time used " . ($now - $start) . "<br><br>\n";
$sum = 0;
foreach ($sites as $x => $list)
$sum += count($list);
echo "$sum screen locations contain one or more units<br>\n";
The output from this shows that we can generate 40,000 units at random locations and then track all units by screen x,y location, or site, for later use within a very short period of time.
Generating red team:
... r:0;x:49;y:642;h:10
... r:1;x:508;y:1162;h:10
... r:2;x:444;y:8;h:10
...
... r:19997;x:553;y:851;h:10
... r:19998;x:608;y:414;h:10
... r:19999;x:860;y:1203;h:10
Red side generated, total time used 0.070003986358643
Generating blue team:
... b:0;x:1799;y:445;h:10
... b:1;x:1913;y:177;h:10
... b:2;x:1730;y:678;h:10
...
... b:19997;x:1586;y:919;h:10
... b:19998;x:1445;y:3;h:10
... b:19999;x:1061;y:542;h:10
Blue side generated, total time used 0.14700794219971
39697 screen locations contain one or more units
In terms of file storage, I would suggest something similar to that suggested by others. You would probably write the full initialization state to file and then list adjustments to that state on a frame by frame basis. Your PHP code would then adjust state to generate display effects by reading changes on a frame by frame basis.
Before I continue I wonder if you could reply with some information about what you mean by targeting. Is this movement destination with combat occurring automatically when nearby? Also, why do you look for and find nearby friend and enemy units? Is this to adjust combat effects or to generate movement actions?
The following function demonstrates locating nearby units. Notice that the blue and the red units are randomly placed on opposite sides of the screen by the code above, so you aren't likely to see any nearby foes.
function nearby($unit, $sites, $withinx=5, $withiny=5, $verbose=0)
{
if ( $verbose )
echo "Looking for units near unit '$unit'<br>\n";
$data = explode(';',$unit);
foreach ($data as $datum)
{
if ( strncmp($datum,"x:",2)==0 )
$x = (integer)substr($datum,2);
if ( strncmp($datum,"y:",2)==0 )
$y = (integer)substr($datum,2);
}
if ( $verbose )
echo "... this unit is located at ($x,$y)<br>\n";
$nearby = array();
for ($sx = $x - $withinx; $sx <= $x + $withinx; $sx++)
{
for ($sy = $y - $withiny; $sy <= $y + $withiny; $sy++)
{
$list = @$sites[$sx][$sy];
if ( count($list) )
{
foreach ($list as $key => $candidate)
{
if ( strncmp($candidate,$unit,strlen($candidate))==0 )
continue;
$nearby[] = $candidate;
if ( $verbose )
echo "... ... unit at $sx,$sy found: $candidate<br>\n";
}
}
}
}
return $nearby;
}
Then, some code to run several thousand closeness checks with verbose mode turned on for the first few.
echo "<br>\n";
for ($i=0; $i<1000; $i++)
{
$r = mt_rand(0,19999);
$verbose = 0;
if ( $i < 2 )
$verbose = 1;
nearby($red[$r],$sites,5,5,$verbose);
}
echo "1000 red units randomly checked for nearby units<br>\n";
echo "<br>\n";
for ($i=0; $i<1000; $i++)
{
$r = mt_rand(0,19999);
$verbose = 0;
if ( $i < 2 )
$verbose = 1;
nearby($blue[$r],$sites,5,5,$verbose);
}
echo "1000 blue units randomly checked for nearby units<br>\n";
$now = microtime(true);
echo "<br>Total time used " . ( $now - $start) . "<br>\n";
Sample output from the search for nearby units follows:
Looking for units near unit 'r:16452;x:332;y:944;h:10'
... this unit is located at (332,944)
... ... unit at 335,945 found: r:3376
... ... unit at 336,948 found: r:14128
Looking for units near unit 'r:4414;x:3;y:1223;h:10'
... this unit is located at (3,1223)
... ... unit at 2,1219 found: r:1210
... ... unit at 8,1226 found: r:461
1000 red units randomly checked for nearby units
Looking for units near unit 'b:4002;x:1531;y:224;h:10 '
... this unit is located at (1531,224)
... ... unit at 1530,222 found: b:11267
Looking for units near unit 'b:3006;x:1011;y:349;h:10 '
... this unit is located at (1011,349)
1000 blue units randomly checked for nearby units
Total time used 0.56303095817566 (including generation, above)
Best Way to Save Files on Server
No matter how you do this you'll end up accumulating serious data if you want to store large numbers of games. The data you save can be reduced to initial state details followed by updates to that state data. You could end up with something similar to the following:
[Frame 0]
Game State
[Frame 1]
State Changes
...
[Frame X]
Last State
[Post Game]
Wrap up details (scores, etc)
For longer term storage you can compress the game state data and zap it up to Amazon's S3 storage system (or something similar). Playback will only involve generating frames quick enough based on the stream of state changes.
If you do store rendering information you might find that certain regions, or strips, of the game board end up empty and unchanging for long periods of time. This too can reduce the data flowing between your unit state engine and your rendering engine.
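As a concrete illustration of the "initial state plus per-frame changes" idea, here is a small JavaScript playback sketch (the field names id, x, y, health and dead are assumptions, not part of the answer):

// Apply one frame's list of changes to a full state kept as a Map of id -> unit.
function applyFrame(state, changes) {
    for (const c of changes) {
        if (c.dead) { state.delete(c.id); continue; } // unit died this frame
        const unit = state.get(c.id);
        if (!unit) continue;
        if (c.x !== undefined) unit.x = c.x;
        if (c.y !== undefined) unit.y = c.y;
        if (c.health !== undefined) unit.health = c.health;
    }
    return state;
}

// Playback: start from the saved "Frame 0" game state, then fold in each frame's deltas.
// const state = new Map(initialUnits.map(u => [u.id, u]));
// frames.forEach(changes => applyFrame(state, changes));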
Fastest Way to Calculate Battle
The language you use to generate or calculate the battle does not have to be the same language as your rendering engine. For example, you could have your rendering engine requesting frames from a separate state management engine. As long as the interfaces stay the same you could upgrade battle calculation, state replay and frame data generation systems as needed.
If time and interest permit I'd go with PHP at first just to get the kinks out of the system -- then invest in whatever portion gives you the biggest bang next. However, I like the challenge of pushing PHP and trying to get "fast" from a language some people assume is slow.
Don't hesitate to store additional state details (that are not written out) during the game. For example, above, I've stored the list of units keyed by the X,Y location on the screen. This attempts to provide a quick path for determining which other units are close to a unit. Anything that takes a long time to iterate through might be better served by a maintained data set that can be looked up via key.
As for the calculation itself, I'm going to suggest that you consider ways to reduce the amount of work being done. Though you have 40,000 units on screen, you don't have to process every one of them every frame. You could, for example, process each unit only once every 10 frames, while adjusting the scope of action per processed unit to keep the pace of the game the same.
If needed, you can provide rules and reasons for behavior that reduces your processing load. For example, if a unit moved last frame it is not ready to fight until the +Nth frame. If a unit was wounded it must wait until the +Nth frame, healing 1 in the process, before continuing. To adjust for these processing friendly delays you could increase movement rates and damage rates.
Again, don't confuse the size of the battle processing state system with the amount of information that must be saved to replay the game. The state system can load up on data used to make processing more efficient without saving that data anywhere as long as it can be regenerated during playback.
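A minimal JavaScript sketch of that "process a tenth of the units per frame" idea (the slice count and field names are assumptions):

// Each frame advances only one rotating bucket of units; effects are scaled up
// by SLICES so the overall pace of the battle stays the same.
const SLICES = 10;

function stepFrame(units, frame) {
    const slice = frame % SLICES;
    for (let i = slice; i < units.length; i += SLICES) {
        const u = units[i];
        if (u.health <= 0) continue; // skip dead units
        u.x += u.team * SLICES;      // move/attack logic would go here, scaled by SLICES
    }
}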

To limit the size, you're probably best off not using JSON. From the structure you gave, it seems that each of the elements can fit in two bytes, meaning that each unit datum will only take up around 12 bytes or so. Since you're saving to disk, the cost of serialization and deserialization is not important compared to the cost of disk I/O.
I'd recommend using a binary format for this:
2 bytes for id
2 bytes for x
2 bytes for y
1 byte for color
1 byte for health
4 bytes for target
which is 12 bytes long. Doing this in JavaScript is slightly clunky, but it can be done by using the bitwise operators to pull bytes out of a string, for instance:
var teams = ['red', 'blue'];            // lookup table for the 1-byte team field
var ustr = getUnitString();             // however you fetch one unit's 12-byte (6 code unit) string
var unit = {};
unit.id = ustr.charCodeAt(0);           // 2 bytes
unit.x = ustr.charCodeAt(1);            // 2 bytes
unit.y = ustr.charCodeAt(2);            // 2 bytes
var teamAndHealth = ustr.charCodeAt(3); // 1 byte team + 1 byte health
unit.team = teams[teamAndHealth >>> 8];
unit.health = teamAndHealth & 0xFF;
unit.target = [ustr.charCodeAt(4), ustr.charCodeAt(5)]; // 4 bytes = target x and y
As for generating the data, PHP is probably not your best choice either. I'd recommend using a language such as C#, Perl, or Java, which can all write binary data quite easily, and have automatic garbage collection to make your life easier.
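For what it's worth, here is a sketch of reading the same 12-byte-per-unit format through an ArrayBuffer instead of string char codes (assumptions: the server writes the fields little-endian in the order listed above, and the frame file is fetched as raw bytes):

// Decode a whole frame of fixed-size 12-byte unit records from a binary file.
async function loadFrame(url) {
    const buf = await (await fetch(url)).arrayBuffer();
    const view = new DataView(buf);
    const units = [];
    for (let off = 0; off < buf.byteLength; off += 12) {
        units.push({
            id:     view.getUint16(off, true),
            x:      view.getUint16(off + 2, true),
            y:      view.getUint16(off + 4, true),
            team:   view.getUint8(off + 6),   // 0 = red, 1 = blue (assumed encoding)
            health: view.getUint8(off + 7),
            target: [view.getUint16(off + 8, true), view.getUint16(off + 10, true)],
        });
    }
    return units;
}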

Related

Web Audio API creating a Peak Meter with AnalyserNode

What is the correct way to implement a Peak Meter like those in Logic Pro with the Web Audio API AnalyserNode?
I know AnalyserNode.getFloatFrequencyData() returns decibel values, but how do you combine those values to get the one to be displayed in the meter? Do you just take the maximum value, like in the following code sample (where analyserData comes from getFloatFrequencyData())?
let peak = -Infinity;
for (let i = 0; i < analyserData.length; i++) {
const x = analyserData[i];
if (x > peak) {
peak = x;
}
}
Inspecting some output from just taking the max makes it look like this is not the correct approach. Am I wrong?
Alternatively, would it be a better idea to use a ScriptProcessorNode instead? How would that approach differ?
If you take the maximum of getFloatFrequencyData()'s results in one frame, then what you are measuring is the audio power at a single frequency (whichever one has the most power). What you actually want to measure is the peak at any frequency — in other words, you want to not use the frequency data, but the unprocessed samples not separated into frequency bins.
The catch is that you'll have to compute the decibel power yourself. This is fairly simple arithmetic: you take some number of samples (one or more), square them, average them, and take 10 * log10 of the result. Note that even a “peak” meter may be doing averaging — just on a much shorter time scale.
Here's a complete example. (Warning: produces sound.)
document.getElementById('start').addEventListener('click', () => {
const context = new(window.AudioContext || window.webkitAudioContext)();
const oscillator = context.createOscillator();
oscillator.type = 'square';
oscillator.frequency.value = 440;
oscillator.start();
const gain1 = context.createGain();
const analyser = context.createAnalyser();
// Reduce output level to not hurt your ears.
const gain2 = context.createGain();
gain2.gain.value = 0.01;
oscillator.connect(gain1);
gain1.connect(analyser);
analyser.connect(gain2);
gain2.connect(context.destination);
function displayNumber(id, value) {
const meter = document.getElementById(id + '-level');
const text = document.getElementById(id + '-level-text');
text.textContent = value.toFixed(2);
meter.value = isFinite(value) ? value : meter.min;
}
// Time domain samples are always provided with the count of
// fftSize even though there is no FFT involved.
// (Note that fftSize can only have particular values, not an
// arbitrary integer.)
analyser.fftSize = 2048;
const sampleBuffer = new Float32Array(analyser.fftSize);
function loop() {
// Vary power of input to analyser. Linear in amplitude, so
// nonlinear in dB power.
gain1.gain.value = 0.5 * (1 + Math.sin(Date.now() / 4e2));
analyser.getFloatTimeDomainData(sampleBuffer);
// Compute average power over the interval.
let sumOfSquares = 0;
for (let i = 0; i < sampleBuffer.length; i++) {
sumOfSquares += sampleBuffer[i] ** 2;
}
const avgPowerDecibels = 10 * Math.log10(sumOfSquares / sampleBuffer.length);
// Compute peak instantaneous power over the interval.
let peakInstantaneousPower = 0;
for (let i = 0; i < sampleBuffer.length; i++) {
const power = sampleBuffer[i] ** 2;
peakInstantaneousPower = Math.max(power, peakInstantaneousPower);
}
const peakInstantaneousPowerDecibels = 10 * Math.log10(peakInstantaneousPower);
// Note that you should then add or subtract as appropriate to
// get the _reference level_ suitable for your application.
// Display value.
displayNumber('avg', avgPowerDecibels);
displayNumber('inst', peakInstantaneousPowerDecibels);
requestAnimationFrame(loop);
}
loop();
});
<button id="start">Start</button>
<p>
Short average
<meter id="avg-level" min="-100" max="10" value="-100"></meter>
<span id="avg-level-text">—</span> dB
</p>
<p>
Instantaneous
<meter id="inst-level" min="-100" max="10" value="-100"></meter>
<span id="inst-level-text">—</span> dB
</p>
Do you just take the maximum value
For a peak meter, yes. For a VU meter, there's all sorts of considerations in measuring the power, as well as the ballistics of an analog meter. There's also RMS power metering.
In digital land, you'll find a peak meter to be most useful for many tasks, and by far the easiest to compute.
A peak for any given set of samples is the highest absolute value in the set. First though, you need that set of samples. If you call getFloatFrequencyData(), you're not getting sample values, you're getting the spectrum. What you want instead is getFloatTimeDomainData(). This data is a low resolution representation of the samples. That is, you might have 4096 samples in your window, but your analyser might be configured with 256 buckets... so those 4096 samples will be resampled down to 256 samples. This is generally acceptable for a metering task.
From there, it's just Math.max(-Math.min(...samples), Math.max(...samples)) to get the max of the absolute value (spread the array, or loop over it for large buffers).
Suppose you wanted a higher resolution peak meter. For that, you need all the raw samples you can get. That's where a ScriptProcessorNode comes in handy. You get access to the actual sample data.
Basically, for this task, AnalyserNode is much faster, but slightly lower resolution. ScriptProcessorNode is much slower, but slightly higher resolution.
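For reference, a minimal sketch of the AnalyserNode approach described above (it assumes an AnalyserNode named analyser that is already connected to your source):

// Read the time-domain samples and report the largest absolute value as the peak, in dBFS.
const samples = new Float32Array(analyser.fftSize);

function readPeakDb() {
    analyser.getFloatTimeDomainData(samples);
    let peak = 0;
    for (let i = 0; i < samples.length; i++) {
        peak = Math.max(peak, Math.abs(samples[i]));
    }
    return 20 * Math.log10(peak); // -Infinity when the buffer is silent
}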

JS Minification: Compressing the code

An online tool, such as JSCompress, will reduce code size by up to 80%. It's easy to notice that the resulting compressed code has the whitespace removed. Beyond the removal of EOL and ' ' characters, is there any other trickery needed to minify a JS file?
Example compressed:
function glow(e){$("#"+e).fadeIn(700,function(){$(this).fadeOut(700)})}function startLevel(){ptrn=[],pos=0,setLevel(lvl),$("#mg-lvl").fadeOut("slow",function(){$("#mg-contain").prop("onclick",null).off("click"),$("#mg-contain").css("cursor","default"),$(this).text("Level "+lvl+": "+ptrn.length+" blink(s)."),$(this).fadeIn("slow"),showLevel(0)})}function setLevel(e){ptrn.push(Math.floor(3*Math.random()+1)),0==e||setLevel(--e)}function showLevel(e){$("#b"+ptrn[e]+"c").fadeOut(speed,function(){$("#ball_"+ptrn[e]).fadeOut(speed,function(){$("#b"+ptrn[e]+"c").fadeIn(speed),$(this).fadeIn(speed,function(){e+1<ptrn.length&&showLevel(++e,speed)})})}),e+1==ptrn.length&&setTimeout(bindKeys(1),ptrn.length*speed+15)}function bindKeys(e){for(var e=1;e<4;e++)bind(e)}function bind(e){$("#ball_"+e).on("click",function(){$("#b"+e+"c").fadeOut(speed,function(){$("#ball_"+e).fadeOut(speed,function(){$("#ball_"+e).fadeIn(speed),$("#b"+e+"c").fadeIn(speed),referee(e)&&unbind()})})})}function referee(e){if(pos<ptrn.length&&(e===ptrn[pos]?$("#mg-score").text(parseInt($("#mg-score").text())+1):end()),++pos==ptrn.length)return++lvl,speed-=40,!0}function unbind(){for(var e=1;e<4;e++)$("#ball_"+e).off();startLevel()}function nestedFade(e,n,t){e[n]&&$(e[n]).fadeOut("fast",function(){t[n]&&($(e),t[n]),nestedFade(e,++n,t)})}function end(){for(var e=[],n=[],t=1;t<4;t++)e.push("#b"+t+"c"),e.push("#ball_"+t),n.push(null);e.push("#mg-contain"),n.push('.fadeOut("slow")'),e.push("#mg-obj"),n.push(".fadeOut('slow')"),e.push("#bg-ball-container"),n.push(".toggle()"),nestedFade(e,0,n)}var ptrn=[],pos=0,lvl=1,speed=400,b1=setInterval(function(){glow("ball_1b",700)}),b2=setInterval(function(){glow("ball_2b",700)}),b3=setInterval(function(){glow("ball_3b",700)});
Example uncompressed:
var ptrn = [];
var pos = 0;
var lvl = 1;
var speed = 400;
/* make balls glow */
function glow(id)
{
$('#'+id).fadeIn(700, function(){$(this).fadeOut(700);})
}
var b1 = setInterval(function(){ glow('ball_1b',700) ,1500});
var b2 = setInterval(function(){ glow('ball_2b',700) ,1500});
var b3 = setInterval(function(){ glow('ball_3b',700) ,1500});
/* end */
function startLevel()
{
ptrn = [];
pos = 0;
/* set pattern for the level */
setLevel(lvl);
/* display prompt for level */
$('#mg-lvl').fadeOut("slow", function(){
$('#mg-contain').prop('onclick',null).off('click');
$('#mg-contain').css('cursor','default');
$(this).text("Level " + lvl + ": " + ptrn.length + " blink(s).");
$(this).fadeIn('slow');
/* play back the pattern for user to play */
showLevel(0); //TODO: use promise and deferred pattern to pull this out of fade function.
});
}
function setLevel(lvl)
{
ptrn.push(Math.floor((Math.random() * 3) + 1));
(lvl == 0 ) ? null : setLevel(--lvl);
}
function showLevel(i)
{
/* blink the balls */
$('#b'+ptrn[i]+'c').fadeOut(speed, function(){
$('#ball_'+ptrn[i]).fadeOut(speed, function(){
$('#b'+ptrn[i]+'c').fadeIn(speed);
$(this).fadeIn(speed, function(){
if(i+1<ptrn.length)
showLevel(++i,speed);
});
});
});
if( (i+1) == ptrn.length)
setTimeout( bindKeys(1), ptrn.length*speed+15) //after the pattern is revealed bind the clicker
}
function bindKeys(i)
{
for(var i=1;i<4;i++)
bind(i);
}
function bind(i)
{
$('#ball_'+i).on('click', function() {
$('#b'+i+'c').fadeOut(speed, function() {
$('#ball_'+i).fadeOut(speed, function() {
$('#ball_'+i).fadeIn(speed);
$('#b'+i+'c').fadeIn(speed);
if(referee(i))
unbind();
});
});
});
}
function referee(val)
{
if(pos < ptrn.length){
( val === ptrn[pos] ) ? $('#mg-score').text(parseInt($('#mg-score').text())+1) : end();
}
if(++pos == ptrn.length)
{
++lvl;
speed-=40;
return true;
}
}
function unbind()
{
for(var i=1;i<4;i++)
$( "#ball_"+i).off();
startLevel();
}
function nestedFade(id,i,func)
{
(!id[i]) ? 0 : $(id[i]).fadeOut('fast',function(){ if(func[i])
{$(id)+func[i];};nestedFade(id,++i,func);})
}
function end()
{
var id = [];
var func = [];
for(var i=1;i<4;i++){
id.push('#b'+i+'c');
id.push('#ball_'+i);
func.push(null)
}
id.push('#mg-contain');
func.push('.fadeOut("slow")');
id.push('#mg-obj');
func.push(".fadeOut('slow')");
id.push('#bg-ball-container');
func.push(".toggle()");
nestedFade(id,0,func);
}
Saves 32% on file size...and if that is the case, is it a fair assumption then that writing less is doing more for the end user?
The same way you can 'minify' a file to reduce its size, you can also 'uglify' a file, which takes your code and shortens things like variable names to the same end: reduce file size by reducing the number of characters in it (not just removing line breaks and space characters).
While it will reduce loadtime for a user, it's not a great practice to write minified/uglified-style code off the bat. That's why in almost any professional build/deploy process, you take your clear, descriptive code and then run your build processes to reduce the size of your files and eventually deploy versions that your end user will have a quicker time loading. You can always write your regular code, then run a compression process like the one you described, save it into a "public" folder and upload that for users to have access to, rather than your fleshed out code.
All a minifier will do is remove white space, which like you said, is ' ' and EOL characters. I believe you may be thinking of file compression tools such as a .zip file with the way your question is worded. Such file types (.zip) will find common strings in your file, and put references to the original string rather than having it written out 10 times. Meaning if the string "I like cake" shows up 4 times in your file, it will have "I like cake" in one location, and the other three locations will reference that first location, shortening the length of the file and therefore decreasing its size.
Well, the main reason JS, CSS and HTML get minified is to decrease the size of the files transmitted from server to client when a client requests a webpage. This decrease in size allows for a faster load time. So technically writing less is more for a webpage's load time, but realistically the effect of you as a developer consciously writing shorter code to minimize file size will either a) be too minimal a change to actually make a difference, or b) lead to loss of functionality or bugs, because the focus is on cutting down code length, not code quality.

javascript, getting a random number result in each singular array

I'm not sure how to word the question, and I'm still quite new at JavaScript.
So I've got a random quote generator that stores each quote as an array. I'd like some of the quotes to include a randomly generated number, so the result could be "2 zombies attack" or "7 zombies attack", with the number randomised each time. The end result is for a browser-based text game. The code I have so far is:
var quotes = [
[x, 'Zombies attack!'],
[x, 'other creatures attack'],
['next line'],
]
function newQuote() {
var randomNumber = Math.floor(Math.random() * (quotes.length));
document.getElementById('quote').innerHTML = quotes[randomNumber];
}
Ideally I need x (or i, however it's going to work) to be a random number within a set range, chosen differently for each array.
Thank you
P.S. I forgot to mention that not all the quotes require a number. That's why I've done it as a nested array.
If I understand your goal correctly, you want to have a set of similar-ish message templates, pick one of them at some point and fill it with data, correct? There are a lot of ways to tackle this problem, depending on how varied your templates can be. For a simple case where you just need to prepend a number to a string, I'd do something like this:
var messages = [" zombies attack",
" other creatures attack"], // define your messages
messageIndex = Math.floor(Math.random() * messages.length), // pick one of them
numberOfMonsters = Math.floor(Math.random() * 10 + 1), // get your random number
result = numberOfMonsters + messages[messageIndex]; // construct a resulting message
document.getElementById('quote').textContent = result;
If you'd rather have more complex strings where you don't necessarily add a number (or any string) to the beginning, like ["There's X things in the distance", "X things are somewhere close"], then I'd recommend to either come up with some sort of string formatting of your own or use a library to do that for you. sprintf.js seems to be just right for that, it will let you do things like this:
var messages = ["%d zombies attack",
"A boss with %d minions attacks"], // define your messages
messageIndex = Math.floor(Math.random() * messages.length), // pick one of them
numberOfMonsters = Math.floor(Math.random() * 10 + 1), // get your random number
result = sprintf(messages[messageIndex], numberOfMonsters) // format a final message
document.getElementById('quote').textContent = result;
EDIT: Your task is much more complex than what is described in the original question. You need to think about your code and data organization. You have to outline what is finite and can be enumerated (types of actions are finite: you can loot, fight, move, etc.), and what is arbitrary and dynamic (the list of monsters and the loot table are arbitrary: you have no idea what type and amount of monsters the game designers will come up with). After you've defined your structure you can come up with a quick and dirty message composer that takes arbitrary entities and puts them into a finite set of contexts. Again, I'm sort of shooting in the dark here, but here's an updated version of the code on Plunker.
I got it to do what I want and still have the numbers differ. The issue was that I should have had the number generators within the quote function. I can also create multiple variables for different number generators. The plan is to then integrate it with PHP to add content dynamically, which I can do. Thanks Dmitry for guiding me in the right direction.
function newQuote() {
var MonsterOne = Math.floor((Math.random() * 14) + 0);
var MonsterTwo = Math.floor((Math.random() * 14) + 0);
var MonsterThree = Math.floor((Math.random() * 14) + 0);
var MonsterFour = Math.floor((Math.random() * 14) + 0);
var quotes = [
['Test', MonsterOne, 'One'],
['Test', MonsterTwo, 'Two'],
['Test', MonsterThree, 'Three'],
[MonsterFour, 'Four'],
['Five'],
]
var randomNumber = Math.floor(Math.random() * (quotes.length));
document.getElementById('quote').innerHTML = quotes[randomNumber];
}
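One small note on the display step (an illustrative tweak, not part of the original answer): assigning an array to innerHTML coerces it to a comma-separated string, so ['Test', 3, 'One'] renders as "Test,3,One". Joining the parts explicitly reads more naturally:

// Join the quote parts with spaces instead of the default comma separator.
document.getElementById('quote').textContent = quotes[randomNumber].join(' ');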

jStorage limit in chrome? [duplicate]

Since localStorage (currently) only supports strings as values, and objects need to be stringified (stored as a JSON string) before they can be stored, is there a defined limitation regarding the length of the values?
Does anyone know if there is a definition which applies to all browsers?
Quoting from the Wikipedia article on Web Storage:
Web storage can be viewed simplistically as an improvement on cookies, providing much greater storage capacity (10 MB per origin in Google Chrome(https://plus.google.com/u/0/+FrancoisBeaufort/posts/S5Q9HqDB8bh), Mozilla Firefox, and Opera; 10 MB per storage area in Internet Explorer) and better programmatic interfaces.
And also quoting from a John Resig article [posted January 2007]:
Storage Space
It is implied that, with DOM Storage, you have considerably more storage space than the typical user agent limitations imposed upon Cookies. However, the amount that is provided is not defined in the specification, nor is it meaningfully broadcast by the user agent.
If you look at the Mozilla source code we can see that 5120KB is the default storage size for an entire domain. This gives you considerably more space to work with than a typical 2KB cookie.
However, the size of this storage area can be customized by the user (so a 5MB storage area is not guaranteed, nor is it implied) and the user agent (Opera, for example, may only provide 3MB - but only time will tell.)
Actually, Opera doesn't have a 5MB limit. It offers to increase the limit as applications require more, and the user can even choose "Unlimited storage" for a domain.
You can easily test localStorage limits/quota yourself.
Here's a straightforward script for finding out the limit:
if (localStorage && !localStorage.getItem('size')) {
var i = 0;
try {
// Test up to 10 MB
for (i = 250; i <= 10000; i += 250) {
localStorage.setItem('test', new Array((i * 1024) + 1).join('a'));
}
} catch (e) {
localStorage.removeItem('test');
localStorage.setItem('size', i - 250);
}
}
Here's the gist, JSFiddle and blog post.
The script will test setting increasingly larger strings of text until the browser throws an exception. At that point it'll clear out the test data and set a size key in localStorage storing the size in kilobytes.
Find the maximum length of a single string that can be stored in localStorage
This snippet will find the maximum length of a String that can be stored in localStorage per domain.
//Clear localStorage
for (var item in localStorage) delete localStorage[item];
window.result = window.result || document.getElementById('result');
result.textContent = 'Test running…';
//Start test
//Defer running so DOM can be updated with "test running" message
setTimeout(function () {
//Variables
var low = 0,
high = 2e9,
half;
//Two billion may be a little low as a starting point, so increase if necessary
while (canStore(high)) high *= 2;
//Keep refining until low and high are equal
while (low !== high) {
half = Math.floor((high - low) / 2 + low);
//Check if we can't scale down any further
if (low === half || high === half) {
console.info(low, high, half);
//Set low to the maximum possible amount that can be stored
low = canStore(high) ? high : low;
high = low;
break;
}
//Check if the maximum storage is no higher than half
if (storageMaxBetween(low, half)) {
high = half;
//The only other possibility is that it's higher than half but not higher than "high"
} else {
low = half + 1;
}
}
//Show the result we found!
result.innerHTML = 'The maximum length of a string that can be stored in localStorage is <strong>' + low + '</strong> characters.';
//Functions
function canStore(strLen) {
try {
delete localStorage.foo;
localStorage.foo = Array(strLen + 1).join('A');
return true;
} catch (ex) {
return false;
}
}
function storageMaxBetween(low, high) {
return canStore(low) && !canStore(high);
}
}, 0);
<h1>LocalStorage single value max length test</h1>
<div id='result'>Please enable JavaScript</div>
Note that the length of a string is limited in JavaScript; if you want to view the maximum amount of data that can be stored in localStorage when not limited to a single string, you can use the code in this answer.
Edit: Stack Snippets don't support localStorage, so here is a link to JSFiddle.
Results
Chrome (45.0.2454.101): 5242878 characters
Firefox (40.0.1): 5242883 characters
Internet Explorer (11.0.9600.18036): 16386 122066 122070 characters
I get different results on each run in Internet Explorer.
Don't assume 5MB is available - localStorage capacity varies by browser, with 2.5MB, 5MB and unlimited being the most common values.
Source: http://dev-test.nemikor.com/web-storage/support-test/
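Since the available quota varies, a defensive write wrapped in try/catch is a common pattern; this sketch (illustrative, with an assumed helper name) simply treats any setItem failure as "out of space":

// Attempt to write a key; return false instead of throwing when the quota is exceeded
// (the exception name/code differs between browsers) or storage is disabled.
function trySave(key, value) {
    try {
        localStorage.setItem(key, value);
        return true;
    } catch (e) {
        return false;
    }
}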
I wrote this simple code that is testing localStorage size in bytes.
https://github.com/gkucmierz/Test-of-localStorage-limits-quota
const check = bytes => {
try {
localStorage.clear();
localStorage.setItem('a', '0'.repeat(bytes));
localStorage.clear();
return true;
} catch(e) {
localStorage.clear();
return false;
}
};
Github pages:
https://gkucmierz.github.io/Test-of-localStorage-limits-quota/
I have the same results on desktop Google chrome, opera, firefox, brave and mobile chrome which is ~10Mbytes
And half smaller result in safari ~4Mbytes
You don't want to stringify large objects into a single localStorage entry. That would be very inefficient - the whole thing would have to be parsed and re-encoded every time some slight detail changes. Also, JSON can't handle multiple cross references within an object structure and wipes out a lot of details, e.g. the constructor, non-numerical properties of arrays, what's in a sparse entry, etc.
Instead, you can use Rhaboo. It stores large objects using lots of localStorage entries so you can make small changes quickly. The restored objects are much more accurate copies of the saved ones and the API is incredibly simple. E.g.:
var store = Rhaboo.persistent('Some name');
store.write('count', store.count ? store.count+1 : 1);
store.write('somethingfancy', {
one: ['man', 'went'],
2: 'mow',
went: [ 2, { mow: ['a', 'meadow' ] }, {} ]
});
store.somethingfancy.went[1].mow.write(1, 'lawn');
BTW, I wrote it.
I've condensed a binary test into this function that I use:
function getStorageTotalSize(upperLimit/*in bytes*/) {
var store = localStorage, testkey = "$_test"; // (NOTE: Test key is part of the storage!!! It should also be an even number of characters)
var test = function (_size) { try { store.removeItem(testkey); store.setItem(testkey, new Array(_size + 1).join('0')); } catch (_ex) { return false; } return true; }
var backup = {};
for (var i = 0, n = store.length; i < n; ++i) backup[store.key(i)] = store.getItem(store.key(i));
store.clear(); // (you could iterate over the items and backup first then restore later)
var low = 0, high = 1, _upperLimit = (upperLimit || 1024 * 1024 * 1024) / 2, upperTest = true;
while ((upperTest = test(high)) && high < _upperLimit) { low = high; high *= 2; }
if (!upperTest) {
var half = ~~((high - low + 1) / 2); // (~~ is a faster Math.floor())
high -= half;
while (half > 0) high += (half = ~~(half / 2)) * (test(high) ? 1 : -1);
high = testkey.length + high;
}
if (high > _upperLimit) high = _upperLimit;
store.removeItem(testkey);
for (var p in backup) store.setItem(p, backup[p]);
return high * 2; // (*2 because of Unicode storage)
}
It also backs up the contents before testing, then restores them.
How it works: it doubles the size until the limit is reached or the test fails. It then stores half the distance between low and high and subtracts/adds half of that half each time (subtract on failure, add on success), honing in on the proper value.
upperLimit is 1GB by default, and just limits how far upwards to scan exponentially before starting the binary search. I doubt this will even need to be changed, but I'm always thinking ahead. ;)
On Chrome:
> getStorageTotalSize();
> 10485762
> 10485762/2
> 5242881
> localStorage.setItem("a", new Array(5242880).join("0")) // works
> localStorage.setItem("a", new Array(5242881).join("0")) // fails ('a' takes one spot [2 bytes])
IE11, Edge, and FireFox also report the same max size (10485762 bytes).
You can use the following code in modern browsers to efficiently check the storage quota (total & used) in real-time:
if ('storage' in navigator && 'estimate' in navigator.storage) {
navigator.storage.estimate()
.then(estimate => {
console.log("Usage (in Bytes): ", estimate.usage,
", Total Quota (in Bytes): ", estimate.quota);
});
}
I'm doing the following:
getLocalStorageSizeLimit = function () {
var maxLength = Math.pow(2,24);
var preLength = 0;
var hugeString = "0";
var testString;
var keyName = "testingLengthKey";
//2^24 = 16777216 should be enough to all browsers
testString = (new Array(Math.pow(2, 24))).join("X");
while (maxLength !== preLength) {
try {
localStorage.setItem(keyName, testString);
preLength = testString.length;
maxLength = Math.ceil(preLength + ((hugeString.length - preLength) / 2));
testString = hugeString.substr(0, maxLength);
} catch (e) {
hugeString = testString;
maxLength = Math.floor(testString.length - (testString.length - preLength) / 2);
testString = hugeString.substr(0, maxLength);
}
}
localStorage.removeItem(keyName);
// Original used this.storageObject in place of localStorage. I can only guess the goal is to check the size of the localStorage with everything but the testString given that maxLength is then added.
maxLength = JSON.stringify(localStorage).length + maxLength + keyName.length - 2;
return maxLength;
};
I really like cdmckay's answer, but it is not well suited to checking the size in real time: it is just too slow (2 seconds for me). This is an improved version, which is much faster and more exact, with an option to choose how big the error can be (default 250,000; the smaller the error, the longer the calculation takes):
function getLocalStorageMaxSize(error) {
if (localStorage) {
var max = 10 * 1024 * 1024,
i = 64,
string1024 = '',
string = '',
// generate a random key
testKey = 'size-test-' + Math.random().toString(),
minimalFound = 0,
error = error || 25e4;
// fill a string with 1024 symbols / bytes
while (i--) string1024 += 1e16;
i = max / 1024;
// fill a string with 'max' amount of symbols / bytes
while (i--) string += string1024;
i = max;
// binary search implementation
while (i > 1) {
try {
localStorage.setItem(testKey, string.substr(0, i));
localStorage.removeItem(testKey);
if (minimalFound < i - error) {
minimalFound = i;
i = i * 1.5;
}
else break;
} catch (e) {
localStorage.removeItem(testKey);
i = minimalFound + (i - minimalFound) / 2;
}
}
return minimalFound;
}
}
To test:
console.log(getLocalStorageMaxSize()); // takes .3s
console.log(getLocalStorageMaxSize(.1)); // takes 2s, but way more exact
This works dramatically faster for the standard error; also it can be much more exact when necessary.
I once developed a Chrome (desktop browser) extension and tested the real maximum size of Local Storage for this reason.
My results:
Ubuntu 18.04.1 LTS (64-bit)
Chrome 71.0.3578.98 (Official Build) (64-bit)
Local Storage content size 10240 KB (10 MB)
Using more than 10240 KB returned this error:
Uncaught DOMException: Failed to execute 'setItem' on 'Storage': Setting the value of 'notes' exceeded the quota.
Edit on Oct 23, 2020
For Chrome extensions the chrome.storage API is available. If you declare the "storage" permission in manifest.json:
{
"name": "My extension",
...
"permissions": ["storage"],
...
}
You can access it like this:
chrome.storage.local.QUOTA_BYTES // 5242880 (in bytes)

Generating a unique ID in Javascript

I want to implement a saving system similar to Imgur where, if a user presses a button, a unique 5-character value is returned. Here is what I have so far:
The database backend uses auto-incrementing IDs starting at 5308416. I use a modified radix function (see below) to convert these numerical IDs into characters, and a reverse function to look character IDs back up as numerical database IDs.
function genID (value)
{
var alphabet = "23456789BCDFGHJKLMNPRSTVWXYZbcdfghjkmnpqrstvwxyz";
var result = "";
var length = alphabet.length;
while (value > 0)
{
result = alphabet[value % length] + result;
value = Math.floor (value / length);
}
return result;
}
The problem is that these generated IDs are very predictable. My question is: how can I make the generated IDs seem random but still unique (so I can look them up in the database as numbers)? I was thinking of using some encryption algorithm but am not sure where to start. Any help or suggestions would be appreciated (maybe there is a better way of doing this too).
Do you have to be able to go both ways (i.e. convert an integer to its hash and back again)? If you can store the hash and look up the content that way, then it's relatively easy to create a function that produces a hard-to-guess, but complete, hash space. You use primes to generate a sequence that only repeats once all possible permutations are exhausted.
The following PHP example is from my own code, adapted from this site:
function hash($len = 6) {
$base = 36;
$gp = array(1,23,809,28837,1038073,37370257 /*,1345328833*/);
$maxlen = count($gp);
$len = $len > ($maxlen-1) ? ($maxlen-1) : $len;
while($len < $maxlen && pow($base,$len) < $this->ID) $len++;
if($len >= $maxlen) throw new Exception($this->ID." out of range (max ".pow($base,$maxlen-1).")");
$ceil = pow($base,$len);
$prime = $gp[$len];
$dechash = ($this->ID * $prime) % $ceil;
$hash = base_convert($dechash, 10, $base);
return str_pad($hash, $len, "0", STR_PAD_LEFT);
}
It would be easy enough to implement that in JavaScript, but ideally you wouldn't need to - you'd have an insert trigger on your table that populated a hash field with the result of that algorithm (adapted for SQL, of course).
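For completeness, here is a minimal JavaScript sketch of the same prime-multiplication idea (the primes are copied from the PHP snippet above, including the one commented out there; BigInt is used so id * prime cannot overflow):

// Map a sequential id to a scrambled, fixed-length base-36 hash.
// The mapping is reversible with the modular inverse of the prime, so ids stay unique.
function hashId(id, len = 6) {
    const primes = [1n, 23n, 809n, 28837n, 1038073n, 37370257n, 1345328833n];
    const ceil = 36n ** BigInt(len);
    if (BigInt(id) >= ceil) throw new RangeError(id + ' is out of range for length ' + len);
    const hash = (BigInt(id) * primes[len]) % ceil;
    return hash.toString(36).padStart(len, '0');
}

// Consecutive ids produce scattered-looking 6-character strings:
console.log(hashId(5308416), hashId(5308417));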
A non-predictable, but unique ID can be made by combining your server-side auto-incrementing number with either a current date/time nugget or with a random number. The server-side auto-incrementing number guarantees uniqueness and the date/time nugget or random number removes the predictability.
For a unique ID in string form that takes the server-side unique number as input and where you add the date/time nugget on the client you can do this:
function genID(serverNum) {
return(serverNum + "" + (new Date).getTime());
}
Or using a random number:
function genID(serverNum) {
return(serverNum + "" + Math.floor(Math.random() * 100000));
}
But, it might be best to add the date/time element on the server and just store that whole unique ID in the database there.
