How to upload content via Github API to hidden folder name? - javascript

I am trying to upload a Github action workflow via the Github V3 API.
I am trying to do the following to upload the main.yaml file to .github/workflows/main.yaml:
await this.put(`https://api.github.com/repos/${this.ownerName}/${this.repoName}/contents/.github/workflows/main.yaml`, {
  message: title,
  content,
  sha,
  branch: newBranch
})
It seems as though including the .github in the file path of the URL returns a 404. Is it possible to upload to a hidden directory? Maybe I need to escape the . somehow?

It might be that the personal access token (PAT) being used has insufficient permissions.
Slightly unintuitive, but to upload workflow files the PAT needs to have the "workflow" scope enabled too, not only write access to the repo.
It can be enabled on GitHub's personal access token settings page.
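For reference, a minimal sketch of the request that endpoint expects, assuming a Node-style environment and a classic PAT with both repo and workflow scopes (the helper name and placeholders are mine, not from the thread). Note that the content field must be base64-encoded, and the dot in .github needs no escaping:

```javascript
// Build the options for PUT /repos/{owner}/{repo}/contents/{path}.
// A 404 here usually means the token lacks a required scope, not a bad path.
function buildCreateFileRequest(token, { message, content, sha, branch }) {
  return {
    method: 'PUT',
    headers: {
      'Authorization': `token ${token}`, // PAT with repo + workflow scopes
      'Accept': 'application/vnd.github.v3+json'
    },
    body: JSON.stringify({
      message,
      content: Buffer.from(content).toString('base64'), // API expects base64
      sha,    // only needed when updating an existing file
      branch
    })
  };
}

// Usage (owner/repo and the token are placeholders):
// fetch(`https://api.github.com/repos/${owner}/${repo}/contents/.github/workflows/main.yaml`,
//       buildCreateFileRequest(process.env.GITHUB_TOKEN, { message, content, sha, branch }))
```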

Related

Building a flask + heroku app, s3 hosting for static assets (js and images currently not loading)

I'm building a flask app called Neuroethics_Behavioral_Task.
I also have an s3 bucket called neuroethics-task. In the root directory, I uploaded a file called experiment.js and an image, test.png.
I followed the instructions in these two parts of Heroku's documentation about s3:
https://devcenter.heroku.com/articles/s3
https://devcenter.heroku.com/articles/s3-upload-python
The first link says the following about how to access assets you've uploaded to s3
After assets are uploaded, you can refer to their public URLs (such as http://s3.amazonaws.com/bucketname/filename) in your application’s code. These files will now be served directly from S3, freeing up your application to serve only dynamic requests.
So I have this line in the header of one of the html templates.
<script href="http://s3.amazonaws.com/neuroethics-task/experiment.js"></script>
I also tried to include the image from copying the path directly on s3 (which is different from the heroku docs). Here's that line.
<img src='s3://neuroethics-task/test.png'
The issue is that nothing happens when I access the web page that's supposed to use the JavaScript from experiment.js, currently while running the Flask application LOCALLY.
I suspect that maybe things will work if I push to Heroku... but I need to get a local, debugged solution up and running first and foremost, so I need to figure out how to correctly reference these files.
I had gotten error messages before when I used src= and when I had variants of the URL's prefix. But now, nothing happens when I get to the webpage that's supposed to load experiment.js. experiment.js uses a JavaScript framework called JsPsych that basically works like a static application -- no redirects occur from JsPsych. You have to create an HTML template for Flask's sake, but all that template has to do is include the reference to the experiment.js file.
Since experiment.js just isn't loading yet, and since there's no other HTML on that template because all of it is within experiment.js, nothing happens.
I have my environmental variables set:
$ export AWS_ACCESS_KEY_ID=jhdfshfjskdhfj
$ export AWS_SECRET_ACCESS_KEY=jlsjfklksfjlfh
I'm not sure about what permissions settings I need on s3. For my bucket, I have
Block public access to buckets and objects granted through new access control lists (ACLs) -- Off
Block public access to buckets and objects granted through any access control lists (ACLs)-- Off
Block public access to buckets and objects granted through new public bucket or access point policies -- On
Block public and cross-account access to buckets and objects through any public bucket or access point policies -- On
So... what's going wrong here? I just want my javascript to load at least.
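One thing worth noting as an aside (this is not from the original thread): a script tag loads its file through the src attribute, not href, and browsers cannot resolve s3:// URLs at all — only the public HTTPS form works in a page. Under those assumptions, the two references would look like:

```html
<!-- script tags use src, not href -->
<script src="http://s3.amazonaws.com/neuroethics-task/experiment.js"></script>
<!-- images need the public HTTPS URL, not the s3:// scheme -->
<img src="http://s3.amazonaws.com/neuroethics-task/test.png">
```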

Javascript - Loading Images from Dropbox returns 403 Forbidden Error

I am attempting to load a series of Images from a Shared Dropbox Folder like so:
function getSprite(raw) {
  var sprt = new Image();
  sprt.crossOrigin = '';
  sprt.src = 'https://dl.dropboxusercontent.com/s/k1v7iv85vntx107/AABOD-CfE3A5sQo0RPPmRmmJa/ground1.png' + (raw ? '?raw=1' : '');
  return sprt;
}
The folder is shared, and Dropbox says that 'People with Link can View'. I have tried to do the same with Google Drive, but I get a Cross Origin Error there.
EDIT: I just tried to share one of the files individually, and it worked. Do I have to now go through and do this for each file in the folder? I thought if I just shared the folder I would have access to all its contents.
ERROR MESSAGE:
GET https://dl.dropboxusercontent.com/s/k1v7iv85vntx107/AABOD-CfE3A5sQo0RPPmRmmJa/characters/triggerman/up.png?raw=1 403 (Forbidden)
It looks like the original shared link you had was:
https://www.dropbox.com/sh/k1v7iv85vntx107/AABOD-CfE3A5sQo0RPPmRmmJa?dl=0
This is a shared link for a folder. Note that you can't just modify it directly to get shared links for individual files inside that folder though, which is what you appear to be trying in your question.
To get the individual files, you have a few options:
Manually get the shared links for each file via the Dropbox web site, as you mentioned.
Use the API to individually but programmatically get shared links for each file: https://www.dropbox.com/developers/documentation/http/documentation#sharing-create_shared_link_with_settings
Use the API to download the files in the original shared link by specifying the path inside the folder: https://www.dropbox.com/developers/documentation/http/documentation#sharing-get_shared_link_file This is likely closest to what you're looking for.
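That last option can be sketched as follows, assuming a Dropbox API access token (the helper name, ACCESS_TOKEN, and the example path are placeholders, not from the thread):

```javascript
// Build the request for POST /2/sharing/get_shared_link_file:
// the shared *folder* link plus a path inside it selects one file.
function buildSharedLinkRequest(sharedLinkUrl, pathInsideFolder, accessToken) {
  return {
    method: 'POST',
    headers: {
      'Authorization': `Bearer ${accessToken}`,
      // Dropbox content endpoints take their arguments as JSON in this header
      'Dropbox-API-Arg': JSON.stringify({ url: sharedLinkUrl, path: pathInsideFolder })
    }
  };
}

// Usage sketch:
// fetch('https://content.dropboxapi.com/2/sharing/get_shared_link_file',
//   buildSharedLinkRequest('https://www.dropbox.com/sh/k1v7iv85vntx107/AABOD-CfE3A5sQo0RPPmRmmJa?dl=0',
//     '/characters/triggerman/up.png', ACCESS_TOKEN)).then(r => r.blob())
```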
I don't think this has much to do with JavaScript. Go incognito and take a look, because all I can see is a 403 error from my browser.

I want to upload a file by ajax but I don't want the server direct link appears

Everyone knows that when uploading a file by ajax we must give the direct link to the PHP file that handles the upload, e.g. url: site.com/upload.php.
What I want is for site.com/upload.php not to be the direct upload endpoint, but just a redirect to the up.php file which actually uploads the file.
The aim is to hide the link responsible for uploading the client's file.
Can this idea be implemented?
IMHO, if your target with that is security, you're targeting the wrong phase.
Instead of protecting the PHP file's location, you can protect the file itself.
The up.php can have some "filename" protection using mod_rewrite (How to remove .php or .html extension from single page?).
The up.php can have authentication and authorization through your site's auth system, and if you want it public, do it keeping a FORM ID pair or something.
The address of up.php will look something like site.com/upload/receiver and will point to your file. If you want to protect even this address, you should use anything other than JS ajax, like Flash, Java, or anything else.
Just follow any step-by-step how-to.
https://www.google.com.br/webhp?sourceid=chrome-instant&ion=1&espv=2&ie=UTF-8#safe=off&q=flash+file+uploader+example
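The mod_rewrite idea above can be sketched like this (assuming Apache with mod_rewrite enabled; the public path and up.php come from the answer):

```apache
# .htaccess at the site root: expose site.com/upload/receiver
# while the real handler stays in up.php
RewriteEngine On
RewriteRule ^upload/receiver$ up.php [L,QSA]
```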
Use SSL encryption...

How to add custom password reset template for Parse app hosted on Parse?

I have a web application that I am hosting on Parse with a subdomain "appname".parseapp.com URL (the quotes are not actually there, and that's not the actual link to my app). Supposedly, I am able to use my own templates for things like the password reset form; however, I haven't had any success. I downloaded the template, modified it, put it in my public directory, and deployed it. I set the Parse Frame URL to "appname".parseapp.com/user_management.html like it says, after also putting the user_management.html file in my public directory. Then I set the directory of the password reset file in the Customize User-Facing Pages section as choose_password.html, since it is right in the public directory. The link sent to the email that attempts to reset the password somehow keeps being wrong and gives me a 404. I'll get a link like this: "appname".parseapp.com/user_management.html?link=%2Fapps%2Fschool-project%2Frequest_password_reset&token=TvIoEhOD8ZsWAP414jBCbY3OI&username=testuser
Any Idea why this isn't working correctly?
Figured out my mistake. I was supposed to include the entire link for the template, not just the path after the domain, e.g. "appname".parseapp.com/choose_password.html rather than just /choose_password.html.

Javascript React Single Page Application + Amazon S3: Permalinks issue

I am using an S3 bucket as static web solution to host a single page React application.
The react application works fine when I hit the root domain s3-bucket.amazon.com, and the HTML5 history API works fine: every time I click on a link the new URL looks right: http://s3-bucket.amazon.com/entities/entity_id
The problem happens when I use permalinks to access the application. Let's assume I am typing the same URL (http://s3-bucket.amazon.com/entities/entity_id) in the browser; I will get the following error from Amazon S3:
404 Not Found
Code: NoSuchKey
Message: The specified key does not exist.
Key: users
RequestId: 3A4B65F49D07B42C
HostId: j2KCK4tvw9DVQzOHaViZPwT7+piVFfVT3lRs2pr2knhdKag7mROEfqPRLUHGeD/TZoFJBI4/maA=
Is it possible to make Amazon S3 to work nicely with permalinks and HTML5 history api? Maybe it can act as proxy?
Thank you
Solution using AWS CloudFront:
Step 1: Go to CloudFront
Click on distribution id
Go to the Error page tab
Click on Create Custom Error Response
Step 2: Fill the form as
HTTP Error Code: 404
TTL: 0
Custom Error Response: Yes
Response Page Path: /index.html
HTTP Response Code: 200
source: https://medium.com/@nikhilkapoor17/deployment-of-spa-without-location-strategy-61a190a11dfc
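The same custom error response can be expressed in a CloudFormation template — this is a sketch of the console steps above, not from the answer itself:

```yaml
# Inside an AWS::CloudFront::Distribution -> DistributionConfig
CustomErrorResponses:
  - ErrorCode: 404
    ErrorCachingMinTTL: 0      # the "TTL: 0" from the steps above
    ResponseCode: 200
    ResponsePagePath: /index.html
```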
Sadly S3 does not support the wildcard routing required for single page apps (basically you want everything after the host in the URL to serve index.html, but preserve the route).
So www.blah.com/hello/world would actually serve www.blah.com/index.html but pass the route to the single page app.
The good news is you can do this with a CDN such as Fastly (Varnish) in front of S3. Setting up a rule such as:
if (req.url.ext == "") {
  set req.url = "/index.html";
}
This will redirect all non-asset requests (anything without a file extension) to index.html on the domain.
I have no experience running an SPA on Amazon S3, but this seems to be a problem of URL rewriting.
It is one thing to have the history (HTML5) API rewrite your URL when running your application / site.
But allowing rewritten URLs to be accessible when refreshing or cold-surfing to your site definitely needs URL rewriting on a server level.
I'm thinking web.config (IIS), .htaccess (Apache) or an nginx site configuration.
It seems the same question already got asked some time ago: https://serverfault.com/questions/650704/url-rewriting-in-amazon-s3
Specify the same name for the index and error files in the "Static website hosting" properties.
Old question, but the simplest way would be to use hash routing. So instead of mydomain.com/somepage/subpage it would be mydomain.com/#/somepage/subpage
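A minimal sketch of the hash-routing idea in plain JavaScript (react-router users would reach for its HashRouter instead; the helper name is mine): the fragment after # is never sent to S3, so the bucket only ever serves the root page.

```javascript
// Everything after '#' stays client-side; S3 only ever sees a request for '/'.
function routeFromHash(href) {
  const i = href.indexOf('#');
  return i === -1 ? '/' : (href.slice(i + 1) || '/');
}

// In the browser you would wire it up like:
// window.addEventListener('hashchange',
//   () => render(routeFromHash(location.href)));
```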
