Is this possible?
To know, in real time, when data has been changed by other users in MS Access, without a requery or refresh.
I'm developing user forms in HTML & JavaScript and using MS Access as the back-end DB.
Three or four users always keep the form open.
I want to refresh the form and display other users' changes in real time, like SQL Server's SqlNotificationRequest or Ajax with PHP.
I am only allowed to use MS Access and HTML with JS on the intranet, due to a company restriction.
Is there no way other than using a timer function with refresh or requery in JS?
You can't do it in real time; you will have to fake it. Decide what is an acceptable lag in the information update (5 seconds? 30 seconds?) and set up a timer on your front end.
When the data is modified in your database, is there logging/auditing? Do you keep a timestamp? If yes, you can use that to check for new changes. If not, just create a single-record, single-field table to store the last modification timestamp. Or, if you already have a generic parameters or global-values table, add one more record there. Make sure anything on your front end that alters data also updates this timestamp field.
Then your front end's timer function can check the last update timestamp and compare it with its own last update timestamp (which you stored locally on the previous timer event) and see if it needs to refresh the data or not.
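A minimal sketch of that timer loop in plain JavaScript, assuming a hypothetical endpoint (`/last-update.php` here) that returns the timestamp from that single-record table, and a `refreshTable()` callback standing in for your actual requery code:

```javascript
// Decide whether the table needs refreshing, given the timestamp we saw on
// the previous timer tick and the one the server just reported.
function needsRefresh(lastSeenTs, serverTs) {
  return serverTs > lastSeenTs;
}

// Polling loop; '/last-update.php' and refreshTable() are placeholders for
// whatever endpoint and redraw logic your page actually uses.
function startPolling(intervalMs, refreshTable) {
  let lastSeenTs = 0;
  setInterval(async () => {
    const res = await fetch('/last-update.php');
    const serverTs = Number(await res.text());
    if (needsRefresh(lastSeenTs, serverTs)) {
      lastSeenTs = serverTs;
      refreshTable(); // requery and redraw only when something changed
    }
  }, intervalMs);
}
```

Only the timestamp travels over the wire on most ticks, so even a 5-second interval stays cheap.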
Related
I am developing a simple script for an inventory tracking spreadsheet that sends an email alert listing low stock items with current quantities. Everything seems to be working well...except, it works TOO well.
I'm depending on Google's native "On Edit" trigger to initiate the script. Unfortunately that means that when I update the inventory on 10 items (which I often do), the script runs 10 times and sends 10 email notifications.
In an effort to consolidate the 10 emails into one, I used Utilities.sleep(300000) to insert a delay before the script runs, hoping that it would then only run once and produce one email with all the edits made during those five minutes.
As I'm sure any expert would already know, I ended up with the same 10 emails, just delayed by 5 minutes. It seems Google runs a separate instance of the script, in parallel, for each trigger.
How can I turn my spamming script into a consolidated email notification that is only sent once, when all edits have been made?
This is my first coding project. I am a coding novice and know only what I have learned through StackOverflow and Google. This question may have been answered somewhere and I just don't know what to search to find it. Thanks in advance for any help/guidance!
First, I think your best bet would be to use some form of update(recipient, subject, body) to keep updating the same one email constantly, then have the "On Edit" trigger below it start the timer that sends the email. That way it would constantly update, but only send after an edit and a delay.
Issue:
An onEdit trigger fires every single time the spreadsheet is edited. If you don't want to send an email on every edit, an onEdit trigger is not a good option.
Workaround #1. Use a button or a custom menu:
I'd suggest firing this function via a button or a custom menu instead, so that you actively decide when to send the email.
Workaround #2. Time-driven trigger:
If you don't want to manually fire the script, another option would be to install a time-based trigger instead (see the available triggers here), which would fire periodically (could be hourly, daily, every X minutes, etc.), and would check whether there has been any change to the low stock items since last execution, and send an email if that's the case.
In order to check whether there has been any change, I'd consider using PropertiesService to store these low stock items every time an email is sent, and compare those properties to current data every time the function is executed.
In broad terms, the script called by the time-driven trigger should do this:
function sendEmailIfChanges() {
  // Get the current low stock items from the spreadsheet
  // (getLowStockItems() is a placeholder for your own lookup)
  var current = JSON.stringify(getLowStockItems());
  // Retrieve the list stored on the previous execution
  var props = PropertiesService.getScriptProperties();
  var previous = props.getProperty('lowStockItems');
  // If the lists differ (item IDs, or stock quantities), send one email
  // and store the current list for the next comparison
  if (current !== previous) {
    MailApp.sendEmail('you@example.com', 'Low stock alert', current);
    props.setProperty('lowStockItems', current);
  }
}
Reference:
Custom Menus in Google Workspace
Time-driven triggers
PropertiesService
This may sound a little imprecise, but is it possible to change the TYPO3 session variable
$GLOBALS["TSFE"]->fe_user->setKey('ses', 't_minus', 0);
from JavaScript somehow, so that the variable can be processed within a listAction to specify which records are shown?
In more detail: I'm working on an extension for a calendar. The calendar is generated in JavaScript (that's the part I'm not responsible for). The listAction basically generates a JSON object of records whose unix timestamps fall within a specific interval (2 weeks in the past, 3 weeks in the future). The JavaScript has "one week forward"/"one week back" buttons. One week back should subtract 604800 seconds (1 week) from the session variable; one week forward should add 604800.
In the listAction the session variable adjusts the timestamp interval (n weeks forward/backward) or rather which records should be put into the JSON object.
Is it possible to access the session variable from javascript or does this violate safety requirements?
It is not possible to change TYPO3 session content directly via JavaScript. You may be able to access the PHP session cookie via JavaScript (I am not quite sure about that), but the session variables themselves are stored in the TYPO3 DB. They are serialized and encrypted via PHP, and you won't be able to get at them via JavaScript; this is only possible via PHP.
What you could do: create oneWeekForwardAction and oneWeekBackAction in your controller, which read the session value and modify it as needed. These actions can be triggered via Ajax. For this to work, you have to generate the appropriate links with f:uri.action in advance in your Fluid template. Place these link strings somewhere in your JSON object; then you can wire them up to the click events.
Don't forget to generate a special AJAX page type with page.config.disableAllHeaderCode. If you search for this together with Ajax you will find examples, e.g. this one:
http://www.sklein-medien.de/tutorials/detail/erstellung-einer-typo3-extension-mit-ajax-aufruf/
It is from 2016 and uses Extbase/Fluid.
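On the JavaScript side, the wiring could look like this sketch. The action URIs are assumed to be the Fluid-generated f:uri.action links mentioned above, injected into the page; the button selectors and the `redrawCalendar` callback are placeholders:

```javascript
const WEEK_SECONDS = 604800;

// The offset arithmetic the server-side action would apply to the session
// value; direction is +1 for "one week forward", -1 for "one week back".
function shiftOffset(currentOffset, direction) {
  return currentOffset + direction * WEEK_SECONDS;
}

// Wire the buttons to the Extbase actions; forwardUri and backUri stand in
// for the f:uri.action links embedded in the JSON object.
function wireCalendarButtons(forwardUri, backUri, redrawCalendar) {
  document.querySelector('#week-forward').addEventListener('click', async () => {
    const records = await (await fetch(forwardUri)).json();
    redrawCalendar(records);
  });
  document.querySelector('#week-back').addEventListener('click', async () => {
    const records = await (await fetch(backUri)).json();
    redrawCalendar(records);
  });
}
```

The session update itself still happens in PHP; the JavaScript only triggers it and redraws from the returned JSON.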
I would create a PHP function to change this session variable (e.g. with eID functionality) and call the function via AJAX in JavaScript.
More see here: https://seethroughweb.com/ajax-with-typo3/
(Sorry, couldn't find a better manual in english, you need to use the new class names)
It's not a question of safety. The idea of a session is to bind to the same data on the server over a series of HTTP requests. JavaScript lives in the browser; there is no direct access from JavaScript to data on the server.
If you are responsible for the JS only, it's the other person's job to provide an interface to the session data for you.
And yes, I think it's a good idea to synchronise your calculations in the browser with the data on the server, or the calendar data will quickly get out of sync. I would even go so far as to say the server should be responsible for this kind of calculation, as it is more reliable than JS. So the person responsible for PHP should do the main job and provide the results to you, e.g. via AJAX.
Some of you might argue that this is a question for programmers.stackexchange.com but, having read through Help Center of Stack Overflow I believe this is a specific programming problem and I am more likely to get a response here.
I have a webapp that uses ExpressJS with Neo4j database as backend. I have a search screen, where I would like to use the power of Neo4j's relationships. The search screen accepts one or more values (i.e. manufacture year, fuel type, gearbox, etc etc) and then make a post request to ExpressJS where I construct a cypher query using the parameters of POST request as shown below as an example:
MATCH
(v:VEHICLE),(v)-[:VGEARBOX_IS]->(:VGBOX{type:'MANUAL'}),
(v)-[:VCONDITION_IS]->(:VCONDITION{condition:'USED'})
WITH DISTINCT v
WHERE v.manufacture_year = 1990
MATCH (v)-[r]->(info)
RETURN v AS vehicle, COLLECT({type:type(r), data:info}) AS details
Let's say that running the above query returns the following three vehicles and their properties.
If there are more than 20 vehicles in the result, then I want to paginate the result, and I know how that works; we make use of SKIP and LIMIT as shown below:
MATCH
(v:VEHICLE)
OPTIONAL MATCH (v)-[r:VFUEL_TYPE|:VGEARBOX_IS|:VHAVING_COLOR|...]->(info)
RETURN
v.vehicle_id AS vehicle_id,
v.engine_size AS engine_size,
v.published_date AS published_date,
COUNT(v) AS count,
COLLECT({type:type(r), data:info}) as data
ORDER BY v.published_date DESC
SKIP 20
LIMIT 16
Here is the workflow,
User navigates to search screen which is a form with POST method and various input fields.
User selects some options based on which he/she wish to search.
User then submits the form, which makes a post request to the server.
This request is handled by a ROUTE which uses the parameters of the request to construct a cypher query shown above. It runs the cypher against the Neo4j database and receive the result.
Let's assume, there are 200 vehicles that match the search result. I then want to display only 20 of those results and provide a next/previous button.
When user is done seeing the first 20, he/she wants to see the next 20, that is when I have to re-run the same query that user submitted initially but, with SKIP value of 20 (SKIP value keeps incrementing 20 as user navigates to next page, and decrement 20 as your moves to previous page).
My question is: what is the best approach to save the search request (or the cypher generated by the original request) so that when the user clicks next/previous page, I re-run the original search cypher query with a different SKIP value? I don't want to make a fresh POST request every time the user goes to the next/previous page. This problem could be resolved in the following ways, but I'm not sure which is most performance-friendly:
Every time the user clicks next or previous page, I make a new POST request with the preserved values of the original request and rebuild the cypher query (POSTs can be costly; I want to avoid this, so please explain if this is actually the better option).
I store the original cypher query in Redis and whenever the user clicks next or previous, I retrieve the query specific to that user (need to handle this either via cookie, session or some sort of hidden uuid) from Redis, supply the new value for SKIP and re-run it (somehow I have to handle when I should delete this entry from Redis - deletion should happen when user change his search or abandon the page/site).
I store the query in session (user does not have to be logged in) or some other temporary storage (than Redis) that provide fast access (not sure if that is safe and efficient)
I am sure somebody has come across this issue and resolved it in an efficient manner, which is why I post the question here. Please advise how I can best resolve this problem.
As far as performance goes, the first thing that you should absolutely do is use Cypher parameters. This is a way to separate your query string from your dynamic data. It has the advantage that you are guarded against injection attacks, but it is also more performant: if your query string doesn't change, Neo4j can cache a query plan for your query and reuse it. With parameters, your first query would look like this:
MATCH
(v:VEHICLE),(v)-[:VGEARBOX_IS]->(:VGBOX{type: {vgearbox_type}}),
(v)-[:VCONDITION_IS]->(:VCONDITION{condition: {vcondition}})
WITH DISTINCT v
WHERE v.manufacture_year = {manufacture_year}
MATCH (v)-[r]->(info)
RETURN v AS vehicle, COLLECT({type:type(r), data:info}) AS details
SKIP ({page} - 1) * {per_page}
LIMIT {per_page}
Your JavaScript library for Neo4j should allow you to pass down a separate parameters object. Here is what that object would look like represented in JSON:
{
  "vgearbox_type": "MANUAL",
  "vcondition": "USED",
  "manufacture_year": 1990,
  "page": 1,
  "per_page": 20
}
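From Node, the parameters object can be built from the POSTed filters plus the requested page, so that only the values change between next/previous requests while the query string (and hence Neo4j's cached plan) stays constant. A sketch, with the driver call shown as a comment since the question doesn't name a specific Neo4j client:

```javascript
// Build the parameter object for the query above from the POSTed search
// filters plus pagination values; key names mirror the JSON shown above.
function buildQueryParams(filters, page, perPage) {
  return Object.assign({}, filters, { page: page, per_page: perPage });
}

// With a Neo4j client session, the call would be roughly:
//   session.run(queryString, buildQueryParams(req.body, page, 20))
// Clicking next/previous only changes the 'page' value, not the query string.
```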
I don't really see much of a problem with making a fresh query to the database from Node each time. You should benchmark how long it actually takes to see if it really is a problem.
If it is something that you want to address with caching, it depends on your server setup. If the Node app and the DB are on the same machine or very close to each other, probably it's not important. Otherwise you could use redis to cache based on a key which is a composite of the values that you are querying for. If you are thinking of caching on a per-user basis, you could even use the browser's local storage, but are users often re-visiting the same pages over and over? How static is your data and does the data vary from user to user?
I created a table that receives data from a SQL database via a PHP script, which passes it back through AJAX to the HTML page.
With this information I create the table.
It works, and the beauty of it: every time the data changes, the table changes.
BUT: it reloads the whole table with the new data.
What I want is to have it reload only the part that has been updated, and then "mark" it until you mouse over it.
Is there any function in JS that allows comparing 2 JSON-encoded strings and then only updating the parts that differ?
I have used jQuery but haven't found anything as of yet. I apologise for not showing any code, but it's protected from sharing.
You have to poll the server with an AJAX request every few seconds or minutes and see if there's any update. If so, you receive that update along with, say, the id or index number of the changed rows, which you can swap in for the old ones. That way you won't have to update the entire table.
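There is no built-in JSON diff in JS, but a small helper can find the changed rows, assuming (my assumption about your data) that each row object carries an id field:

```javascript
// Return the rows in `next` that are new or differ from their counterpart
// in `prev`, matching rows by their `id` property. Comparing serialized
// rows avoids a field-by-field walk.
function changedRows(prev, next) {
  const prevById = new Map(prev.map(row => [row.id, JSON.stringify(row)]));
  return next.filter(row => prevById.get(row.id) !== JSON.stringify(row));
}
```

The returned rows are the only table rows you need to re-render and mark as updated.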
I have a requirement for a multi-part form which I want to apply some clever submission logic. Here's the idea, the form has 3 steps:
Personal Contact Details
Quote Details
Final Comments
As any good marketer, I don't want to lose any data in the event that the user does not complete ALL the steps of this (somewhat long) form.
As a result, what I would like to do is to have the form submit as each step is completed. So that in the event the user drops off we still capture the details on the completed steps.
Ideally I don't want the form to submit 3 times since, if it were going to a simple email script, we'd get 3 results through for each 'complete' submission.
So I'm trying to find some clever way to store the data and submit it after a certain period of time or something along those lines.
I intend to build this in HTML & JavaScript (and, if need be, in PHP). Can anyone suggest the best route to achieve this (from past experience etc.) before I get my feet wet!
Thanks for your time & any suggestions
The best way to achieve this is to have three separate forms, one for each page. Upon the submission of each form, make a post() request with jQuery to a PHP page on the server, containing the serialize()d form data. This PHP page then stores the contents of the form in a database for retrieval later.
If the ajax request is successful, show the next page of the form, otherwise display an error telling the user what happened.
Further reading on .post() and .serialize()
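A plain-JS equivalent of that flow, as a sketch; the /save-step.php endpoint is a placeholder for your own server-side script:

```javascript
// URL-encode one step's fields, as jQuery's .serialize() would.
function serializeStep(fields) {
  return new URLSearchParams(fields).toString();
}

// POST one step's data; advance to the next page only if this resolves.
// '/save-step.php' is a hypothetical endpoint that upserts by session/user.
async function submitStep(stepNumber, fields) {
  const res = await fetch('/save-step.php', {
    method: 'POST',
    headers: { 'Content-Type': 'application/x-www-form-urlencoded' },
    body: serializeStep({ step: stepNumber, ...fields })
  });
  if (!res.ok) throw new Error('Could not save step ' + stepNumber);
}
```

On success you reveal the next form; on failure you show the error and keep the user on the current step.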
You need server support to store the survey's temporary results. When the user submits the next part, you simply append the new answers to the stored record. The trick is in detecting abandoned submissions, but I think if the survey is not completed within 24h, you can safely assume the user closed the browser and will not append any further data.
You must implement persistence on the server; an SQL database is the best option for PHP - millions of examples.
If I understand your question correctly, you are trying to have the behaviour of a wizard in a single page; in that case you can have three forms.
After completion of each section, do an ajax call and save the filled-in data in some temp database table; finally, when the user completes the form, you can collate the temp table data and persist it in your main table.
In case the user doesn't complete all the steps, you can clean up your temp table after a certain period of time, or move the rows to some 'not-complete' table in case you want to do some BI over the data.
I would serialize the response and store it in a database: {id, stage1_data, stage2_data, last_entry_timestamp}.
Assuming that validation is done at each stage before storing the data,
Stage 1 I would check if an entry exists, and if not create a new entry, and store the serialized stage1 info and set timestamp, else retrieve stage 1 info. (back/forward)
Stage 2 If not set, I would update the created entry with the serialized stage2 info and set the timestamp, otherwise retrieve and then update.
Stage 3 I would retrieve stage 1 and stage 2 info, and submit. I would then delete that entry.
Finally I would setup a cron job to look at all entries that are over X hours old, submit them, and delete the entry.
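In broad terms, that staged upsert and the cron job's staleness check could be sketched like this, with an in-memory Map standing in for the database table (all names illustrative):

```javascript
// id -> { stage1, stage2, lastEntry } ; a DB table in the real setup.
const entries = new Map();

// Create-or-update the entry for this user/session and stamp it.
function saveStage(id, stage, data, now = Date.now()) {
  const entry = entries.get(id) || {};
  entry['stage' + stage] = data;
  entry.lastEntry = now;
  entries.set(id, entry);
  return entry;
}

// What the cron job would select: entries untouched for longer than
// maxAgeMs, to be submitted as-is and then deleted.
function staleEntries(maxAgeMs, now = Date.now()) {
  return [...entries.entries()].filter(([, e]) => now - e.lastEntry > maxAgeMs);
}
```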