I am trying to make a web interface that displays the serial output from my Arduino using PySerial. I am using Ajax ($.getJSON) to update my HTML string.
The problem I have now is that every time I request my JSON data, it also re-initialises my ser = serial.Serial('/dev/cu.wchusbserialfa140', 9600), which makes the query slow and prevents real-time updates of the serial output.
My code is as follows.
I am trying my best to execute serial.Serial() only once.
from flask import Flask, g, render_template, jsonify
import serial

app = Flask(__name__)

@app.before_request
def before_request():
    g.status = False

@app.route('/')
def template():
    return render_template('index.html')

@app.route('/result')
def serial_monitor():
    # connect to serial port only once
    if g.status == False:
        ser = serial.Serial('/dev/cu.wchusbserialfa140', 9600)
        g.status = True
        result = str(ser.readline())
        voltage = {'value': result}
    else:
        result = str(ser.readline())
        voltage = {'value': result}
    return jsonify(voltage)
My javascript:
I am using setInterval to repeat it automatically.
$.getJSON($SCRIPT_ROOT + '/result', function(data) {
    $('#voltage').text(data.value);
});
I have been trying to learn how to make my little web interface, and Stack Overflow has been a great help to me. I have searched and tried hard to solve this problem, but I think it is worth reaching out now.
Thank you all in advance !!
Edited:
I have hacked it a bit to make it do what I want for now.
However, I am planning to use a form to get the port value from the user before running the serial.Serial line. I am still looking at the session/global variable route.
global ser
ser = serial.Serial('port', 9600)

@app.route('/')
def template():
    return render_template('index.html')

@app.route('/result')
def serial_monitor():
    result = str(ser.readline())
    voltage = {'value': result}
    return jsonify(voltage)
The following was the solution I found.
By setting the global variable status correctly (inside the function), I can now run any piece of code only once.
@app.route('/')
def template():
    return render_template('index.html')

status = False

@app.route('/result')
def serial_monitor():
    global status
    # connect to serial port only once
    if status == False:
        ser = serial.Serial('/dev/cu.wchusbserialfa140', 9600)
        status = True
    result = str(ser.readline())
    voltage = {'value': result}
    return jsonify(voltage)
Maybe keep ser as a global variable (although this can be a problem if you use multiple process-based workers) so you don't have to open it every time; just seek, or do whatever is required to get it into the correct state (I know nothing about serial, so that may or may not make sense). Or maybe voltage could also be a global that is constantly updated in a background thread, so that the serial_monitor function only has to read the latest value of a variable.
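A minimal sketch of that background-thread idea, assuming PySerial and the port name from the question; the helper names and the locking scheme are illustrative, not a drop-in implementation:

import threading
import serial
from flask import Flask, jsonify

app = Flask(__name__)

latest = {'value': None}   # shared state written by the reader thread
lock = threading.Lock()

def read_serial_forever(port='/dev/cu.wchusbserialfa140', baud=9600):
    # Open the port exactly once and keep pulling lines in the background.
    ser = serial.Serial(port, baud)
    while True:
        line = ser.readline().decode(errors='replace').strip()
        with lock:
            latest['value'] = line

@app.route('/result')
def serial_monitor():
    # The request handler never touches the serial port; it only returns
    # the most recent value stored by the background thread.
    with lock:
        return jsonify(latest)

if __name__ == '__main__':
    threading.Thread(target=read_serial_forever, daemon=True).start()
    app.run(use_reloader=False)  # the reloader would start the thread twice

With this layout the /result route stays fast because serial.Serial() is called once at startup rather than on every request.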
Related
I have a submit button for uploading images and information. When the submit button is clicked, the following happens at the same time:
POST request to the Flask app (for updating the info).
JS (at the same time as above):
GET request to the Flask app for a presigned POST.
Once the presigned POST response is received, upload the file to S3.
What happens is that the POST request to Flask finishes and tells the page to reload. When this happens, the AJAX request gets cancelled (if it is still in progress). Sometimes my code works and other times it doesn't. By adding a time.sleep(3) to the Flask app I can wait for the S3 upload to finish and everything works, but this is not a good solution.
How can I force Flask to wait until the JS function is complete?
I'm trying to save my server by having users upload directly to S3; it should also be faster for them.
Waiting for 3 seconds works, and looking at the XHR logs in Chrome shows what is happening.
preventDefault() on its own doesn't work because there are two requests happening.
@users.route("/account", methods=['GET', 'POST'])
@login_required
def account():
    form = UpdateAccountForm()
    if form.validate_on_submit():
        if form.picture.data:
            # picture_file = save_picture(form.picture.data)
            # current_user.image_file = picture_file
            time.sleep(3)
        current_user.username = form.username.data
        current_user.email = form.email.data
        db.session.commit()
        flash('Your account has been updated!', 'success')
        return redirect(url_for('users.account'))
    elif request.method == 'GET':
        form.username.data = current_user.username
        form.email.data = current_user.email
    image_file = url_for('static', filename='profile_pics/' + current_user.image_file)
    return render_template('account.html', title='Account',
                           image_file=image_file, form=form)
Ctrl + K (the code formatting shortcut) is not working for me, so here's a short JS version. Ctrl + K keeps going to my URL bar in Chrome. :-(
function uploadfile() {
    // Get presigned POST request
    // Upload file to S3
}

document.getElementById("submit").onclick = function() {
    uploadfile();
};
I know why this is working this way, but I don't know of a reasonable solution. Do I have to change my design pattern? I'm using Flask because I'm weaker on JS.
I just graduated from a bootcamp, so I'm pretty new to this.
I could run everything through my app, but it would be harder on my server.
I think I could use socket.io, but it's another layer of complication.
Thanks for looking!
I changed my code to a different approach. Then I returned a few days later to get help from a friend, and my code worked fine. Here is what I used to solve the problem. I don't know why this didn't work for two days...
document.getElementById("accountForm").onsubmit = function (e) {
e.preventDefault();
uploadfile();
document.getElementById('accountForm').submit();
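For completeness, the Flask endpoint that hands out the presigned POST was never shown in the thread; here is a hedged sketch of what it might look like with boto3 (the route, bucket name, and key scheme are assumptions, not the poster's actual code):

import uuid
import boto3
from flask import Flask, jsonify

app = Flask(__name__)
s3 = boto3.client('s3')

@app.route('/presigned-post')
def presigned_post():
    # Hypothetical bucket and key; replace with your own values.
    key = 'uploads/{}.jpg'.format(uuid.uuid4())
    post = s3.generate_presigned_post(
        Bucket='my-example-bucket',
        Key=key,
        ExpiresIn=3600,  # the browser has an hour to finish the upload
    )
    # uploadfile() would POST the file to post['url'] together with
    # the form fields in post['fields'].
    return jsonify(post)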
I created a web crawler in Python 3.7 that pulls different info and stores it in four different lists. I have now come across an issue that I am not sure how to fix. I want to use the data from those four lists on my site and place it into a table built with JS and HTML/CSS. How do I go about accessing the info from my Python file in my JavaScript file? I tried searching in other places before creating an account, and came across some things that mention using JSON, but I am not too familiar with it and would appreciate some help if that is the way to do it. I will post my code below, which is stored in the same directory as my other site files. Thanks in advance!
from requests import get
from bs4 import BeautifulSoup
from flask import Flask

app = Flask(__name__)

@app.route("/")
def main():
    # lists to store data
    names = []
    gp = []
    collectionScore = []
    arenaRank = []

    url = 'https://swgoh.gg/g/21284/gid-1-800-druidia/'
    response = get(url)
    soup = BeautifulSoup(response.content, 'html.parser')

    # username of the guild members:
    for users in soup.findAll('strong'):
        if users.text.strip().encode("utf-8") != '':
            if users.text.strip().encode("utf-8") == '\xe9\x82\x93\xe6\xb5\xb7':
                names.append('Deniz')
            else:
                names.append(users.text.strip().encode("utf-8"))
        if users.text.strip().encode("utf-8") == 'Note':
            names.remove('Note')
        if users.text.strip().encode("utf-8") == 'GP':
            names.remove('GP')
        if users.text.strip().encode("utf-8") == 'CS':
            names.remove('CS')
    print(names)

    # GP of the guild members:
    for galacticPower in soup.find_all('td', class_='text-center'):
        gp.append(galacticPower.text.strip().encode("utf-8"))
    totLen = len(gp)
    i = 0
    finGP = []
    while i < totLen:
        finGP.append(gp[i])
        i += 4
    print(finGP)

    # CS of the guild members:
    j = 1
    while j < totLen:
        collectionScore.append(gp[j])
        j += 4
    print(collectionScore)

    # Arena rank of guild member:
    k = 2
    while k < totLen:
        arenaRank.append(gp[k])
        k += 4
    print(arenaRank)

if __name__ == "__main__":
    app.run()
TLDR: I want to use the four lists - finGP, names, collectionScore, and arenaRank in a JavaScript or HTML file. How do I go about doing this?
Ok, this will be somewhat long but I'm going to try breaking it down into simple steps. The goal of this answer is to:
Have you get a basic webpage being generated and served from python.
Insert the results of your script as javascript into the page.
Do some basic rendering with the data.
What this answer is not:
An in-depth javascript and python tutorial. We don't want to overload you with too many concepts at one time. You should eventually learn about databases and caching, but that's further down the road.
Ok, here's what I want you to do first. Read and implement this tutorial up until the "Creating a Signup Page" section. That section starts to get into dealing with MySQL, which isn't something you need to worry about right now.
Next, you need to execute your scraping script when a request comes in to the server. When you get the results back, you output them into the HTML page template inside a script tag that looks like:
<script>
const data = [];
console.log(data);
</script>
Inside the brackets in data = [], use json.dumps (https://docs.python.org/2/library/json.html) to format your Python array data as JSON. JSON is essentially a subset of JavaScript, so you just output it as a raw JavaScript string here and it gets loaded into the webpage via the script tag.
The console.log statement in the script tag will show the data in the dev tools in your browser.
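As a rough illustration of that step (not the tutorial's exact code; the template name, variable names, and the run_scraper helper are assumptions), the Flask side might look like this:

import json
from flask import Flask, render_template

app = Flask(__name__)

@app.route('/')
def main():
    # run_scraper() stands in for the scraping code from the question and
    # returns the four lists it builds (as plain strings, not bytes).
    names, finGP, collectionScore, arenaRank = run_scraper()
    data = json.dumps({
        'names': names,
        'gp': finGP,
        'collectionScore': collectionScore,
        'arenaRank': arenaRank,
    })
    # In index.html: <script>const data = {{ data|safe }};</script>
    return render_template('index.html', data=data)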
For now, let's pause here. Get all of this working first (probably a few hours to a day's work). Getting into doing HTML rendering with JavaScript is a different topic, and again, I don't want to overload you with too much information right now.
Leave comments on this answer if you need extra help.
I have a website built using the Django framework that takes in an input CSV folder to do some data processing. I would like to use an HTML text box as a console log to let the users know that the data processing is underway. The data processing is done using a Python function. Is it possible for me to change or add text in the text box at certain intervals from my Python function?
Sorry if I am not specific enough with my question; I'm still learning how to use these tools!
Edit - Thanks for all the help, but I am still quite new at this and there are lots of things that I do not really understand. Here is an example of my Python function; not sure if it helps.
def query_result(request, job_id):
    info_dict = request.session['info_dict']
    machines = lt.trace_machine(inputFile.LOT.tolist())
    return render(request, 'tools/result.html', {'dict': json.dumps(info_dict),
                                                 'job_id': job_id})
Actually, my main objective is to let the user know that the data processing has started and that the site is working. I was thinking maybe I could display an output log in an HTML textbox to achieve this.
No, you cannot do that directly, because your function is already running on the server side, so from there you cannot touch anything in the HTML page.
You have two ways to do it:
You can set up an interval function that calls the server, asks for the progress, and updates the page as you want in the callback.
You can open a socket connection between your server and the browser to push updates instantly.
While it is impossible for the server (Django) to directly update the client (browser), you can use JavaScript to make the request, and Django can return a StreamingHttpResponse. As each part of the response is received, you can update the textbox with JavaScript.
Here is a sample with pseudocode:

def process_csv_request(request):
    csv_file = get_csv_file(request)   # however you extract the uploaded file
    return StreamingHttpResponse(process_file(csv_file))

def process_file(csv_file):
    for row in csv_file:
        yield progress                 # some progress message for this chunk of work
        actual_processing(row)
    return "Done"
Alternatively, you could write the progress to the DB or some cache, and call an API from the frontend that repeatedly returns the progress.
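A minimal sketch of that cache-based alternative, assuming Django's cache framework is configured; the job_id key scheme and the do_processing helper are made up for illustration:

from django.core.cache import cache
from django.http import JsonResponse

def process_csv(job_id, csv_file):
    rows = list(csv_file)
    for index, row in enumerate(rows, start=1):
        do_processing(row)  # placeholder for the real per-row work
        # Record how far we have got under a key the frontend can query.
        cache.set('csv-progress-%s' % job_id, int(100 * index / len(rows)), timeout=3600)

def progress(request, job_id):
    # Called repeatedly from the frontend (e.g. with setInterval) and
    # returns whatever process_csv last wrote for this job.
    return JsonResponse({'percent': cache.get('csv-progress-%s' % job_id, 0)})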
You can achieve this with websockets using Django Channels.
Here's a sample consumer:
import json

from asgiref.sync import async_to_sync
from channels.generic.websocket import WebsocketConsumer

class Consumer(WebsocketConsumer):
    def connect(self):
        self.group_name = self.scope['user']
        print(self.group_name)  # use this for debugging; not sure what the scope returns

        # Join group
        async_to_sync(self.channel_layer.group_add)(
            self.group_name,
            self.channel_name
        )
        self.accept()

    def disconnect(self, close_code):
        # Leave group
        async_to_sync(self.channel_layer.group_discard)(
            self.group_name,
            self.channel_name
        )

    def update_html(self, event):
        status = event['status']
        # Send message to WebSocket
        self.send(text_data=json.dumps({
            'status': status
        }))
Running through the Channels 2.0 tutorial, you will learn that by putting some JavaScript on your page, each time it loads it will connect the user to a WebSocket consumer. On connect() the consumer adds the user to a group. This group name is used by your CSV processing function to send a message to the browser of any user connected to that group (in this case just one user) and update the HTML on your page.
import csv

from asgiref.sync import async_to_sync
from channels.layers import get_channel_layer

def send_update(channel_layer, group_name, message):
    async_to_sync(channel_layer.group_send)(
        group_name,
        {
            'type': 'update_html',
            'status': message
        }
    )

def process_csv(file):
    channel_layer = get_channel_layer()
    group_name = get_user_name()  # function to get the same group name as in connect()
    with open(file) as f:
        reader = csv.reader(f)
        send_update(channel_layer, group_name, 'Opened file')
        for row in reader:
            send_update(channel_layer, group_name, 'Processing Row#: %s' % row)
You would include JavaScript on your page as outlined in the Channels documentation, then have an extra onmessage function for updating the HTML:
var socket = new ReconnectingWebSocket(...);
socket.onmessage = function(e) {
    var data = JSON.parse(e.data);
    $('#htmlToReplace').html(data['status']);
};
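One piece the answer leaves out is the routing that attaches the consumer to a WebSocket URL. A minimal sketch, assuming Channels 2.x; the ws/status/ path and app name are chosen here purely for illustration:

# routing.py at the project level
from channels.auth import AuthMiddlewareStack
from channels.routing import ProtocolTypeRouter, URLRouter
from django.urls import re_path

from myapp.consumers import Consumer  # the consumer shown above

application = ProtocolTypeRouter({
    # Normal HTTP requests still go through Django's regular URL routing.
    'websocket': AuthMiddlewareStack(
        URLRouter([
            re_path(r'ws/status/$', Consumer),  # Channels 2.x takes the consumer class directly
        ])
    ),
})

AuthMiddlewareStack is used here because the consumer reads self.scope['user'].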
I currently have a JavaScript variable called myVariableToSend that contains a single string, and I need to send it to my views, where I can make raw SQL queries to gather the corresponding data from the database and bring it back to my JavaScript. Here is what I have:
Javascript:
function scriptFunction(myVariableToSend) {
    $.getJSON("http://127.0.0.1:8000/getData/", myVariableToSend, function(serverdata) {
        window.alert(serverdata);
    });
}
Views.py:
def getData(request):
    some_data = request.GET(myVariableToSend)
    cursor = connection.cursor()
    cursor.execute("SELECT Car_ID FROM cars WHERE Carname = %s ", [some_data])
    row = cursor.fetchall()
    return JsonResponse(row, safe=False)
Urls.py:
url(r'^admin/', include(admin.site.urls)),
url(r'^$', startpage),
url(r'^getData/$', getData ),
I don't think my server-side script (views.py) is working, because when I run my server I get an HTTP 500 error. Any help would be appreciated. Thank you.
UPDATE:
I have found that when I comment out my entire Views.py and only put
def getData(request):
    return JsonResponse({"hello": "World"}, safe=False)
I get no problems and the AJAX request works. But when I have my original getData, it doesn't work. When I add this line to my views.py:

some_data = request.GET(myVariableToSend)

I get an error and the data isn't displayed.
If you want to send your variable to a function in your views, you can capture it with the URL, like this:

$.getJSON('http://127.0.0.1:8000/getData/' + myVariableToSend + '/', function (serverdata) {
    // do your work here
});

Then in urls.py you have:

url(r'getData/(?P<my_var>\w+)/$', views.get_data, name='get_data')

Then in views.py:

def get_data(request, my_var):
    # do your work here
Answering the original question:
Your server is probably failing because of bad syntax in views.py:

some_data = request.GET(myVariableToSend)

myVariableToSend is undefined here, and request.GET is not callable. You should get the value like this:

some_data = request.GET['myVariableToSend']

Besides the original question:
You'll get a lot of headaches if you try to set up your Django app like this. You can query your database much more easily if you use Django's ORM. Read about it here.
Also, if you want to send the data in your models to your JavaScript code, you can save yourself a lot of time by using a framework like Django REST Framework.
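To make the ORM suggestion concrete, here is a hedged sketch that assumes a Car model with car_id and carname fields (invented here to mirror the raw SQL in the question):

from django.http import JsonResponse
from .models import Car  # assumed model backing the "cars" table

def getData(request):
    # .get() returns None instead of raising an error if the parameter is missing
    some_data = request.GET.get('myVariableToSend')
    car_ids = list(
        Car.objects.filter(carname=some_data).values_list('car_id', flat=True)
    )
    return JsonResponse(car_ids, safe=False)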
In my Django project that uses Celery (among a number of other things), I have a Celery task that will upload a file to a database in the background. I use polling to keep track of the upload progress and display a progress bar for uploading. Here are some snippets that detail the upload process:
views.py:
from .tasks import upload_task
...
upload_task.delay(datapoints, user, description) # datapoints is a list of dictionaries, user and description are simple strings
tasks.py:
from taskman.celery import app, DBTask # taskman is the name of the Django app that has celery.py
from celery import task, current_task
@task(base=DBTask)
def upload_task(datapoints, user, description):
    from utils.db.databaseinserter import insertIntoDatabase
    for count in insertIntoDatabase(datapoints, user, description):
        percent_completion = int(100 * (float(count) / float(len(datapoints))))
        current_task.update_state(state='PROGRESS', meta={'percent': percent_completion})
databaseinserter.py:
def insertIntoDatabase(datapoints, user, description):
    # iterate through the datapoints and upload them one by one;
    # at the end of each iteration, yield the number of datapoints completed so far
The uploading code all works well, and the progress bar also works properly. However, I'm not sure how to send a Django message that tells the user that the upload is complete (or, in the event of an error, send a Django message informing the user of the error). When the upload begins, I do this in views.py:
from django.contrib import messages
...
messages.info(request, "Upload is in progress")
And I want to do something like this when an upload is successful:
messages.info(request, "Upload successful!")
I can't do that in views.py since the Celery task is fire-and-forget. Is there a way to do this in celery.py? In my DBTask class in celery.py I have on_success and on_failure defined, so would I be able to send Django messages from there?
Also, while my polling technically works, it's not ideal at the moment. The way the polling works currently, it endlessly checks for a task regardless of whether one is in progress or not. It quickly floods the server console logs, and I imagine it has a negative impact on performance overall. I'm pretty new to writing polling code, so I'm not entirely sure of best practices, or of how to poll only when I need to. What is the best way to deal with the constant polling and the clogging of the server logs? Below is my code for polling.
views.py:
def poll_state(request):
    data = 'Failure'
    if request.is_ajax():
        if 'task_id' in request.POST.keys() and request.POST['task_id']:
            task_id = request.POST['task_id']
            task = AsyncResult(task_id)
            data = task.result or task.state
            if data == 'SUCCESS' or data == 'FAILURE':  # not sure what to do here; what I want is to exit the function early if the current task is already completed
                return HttpResponse({}, content_type='application/json')
        else:
            data = 'No task_id in the request'
            logger.info('No task_id in the request')
    else:
        data = 'Not an ajax request'
        logger.info('Not an ajax request')

    json_data = json.dumps(data)
    return HttpResponse(json_data, content_type='application/json')
And the corresponding jQuery code:
{% if task_id %}
jQuery(document).ready(function() {
    var PollState = function(task_id) {
        jQuery.ajax({
            url: "poll_state",
            type: "POST",
            data: "task_id=" + task_id,
        }).done(function(task) {
            if (task.percent) {
                jQuery('.bar').css({'width': task.percent + '%'});
                jQuery('.bar').html(task.percent + '%');
            }
            else {
                jQuery('.status').html(task);
            }
            PollState(task_id);
        });
    }
    PollState('{{ task_id }}');
});
{% endif %}
(These last two snippets come largely from previous StackOverflow questions on Django+Celery progress bars.)
The simplest answer to reduce logging and overhead is to put a timeout on your next PollState call. The way your function is written right now, it immediately polls again. Something simple like:
setTimeout(function () { PollState(task_id); }, 5000);
This will drastically reduce your logging issue and overhead.
Regarding your Django messaging question, you'd need to pull those completed tasks out with some sort of processing. One way to do it is a Notification model (or similar); you can then add a piece of middleware that fetches unread notifications and injects them into the messages framework.
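A hedged sketch of that Notification-plus-middleware idea, collapsed into one snippet for brevity; the model fields and middleware wiring are assumptions, not code from the answer:

from django.conf import settings
from django.contrib import messages
from django.db import models

class Notification(models.Model):  # rows would be created by the task's on_success/on_failure
    user = models.ForeignKey(settings.AUTH_USER_MODEL, on_delete=models.CASCADE)
    text = models.CharField(max_length=255)
    read = models.BooleanField(default=False)

class NotificationMiddleware:  # add its dotted path to MIDDLEWARE in settings
    def __init__(self, get_response):
        self.get_response = get_response

    def __call__(self, request):
        if request.user.is_authenticated:
            # Turn unread notifications into Django messages, then mark them read.
            unread = Notification.objects.filter(user=request.user, read=False)
            for note in unread:
                messages.info(request, note.text)
            unread.update(read=True)
        return self.get_response(request)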
Thanks to Josh K for the tip on using setTimeout. Unfortunately I could never figure out the middleware approach, so instead I'm going with a simpler approach of sending an HttpResponse in poll_state like so:
if data == "SUCCESS":
return HttpResponse(json.dumps({"message":"Upload successful!", "state":"SUCCESS"}, content_type='application/json'))
elif data == "FAILURE":
return HttpResponse(json.dumps({"message":"Error in upload", "state":"FAILURE"}, content_type='application/json'))
The intent is to simply render a success or error message based on the JSON received. There are new problems now but those are for a different question.