How to programmatically create a pull request with a changed file using the GitHub API - javascript

How can I create a PR on GitHub using their API? Let's imagine I have a package.json file as a string and I want to make changes to it. So I parse it and make my changes; then what exactly do I need to do to make it look like I made those changes locally, i.e. checking out a new branch, making the changes, committing them to the new branch and then pushing them to the remote? I see that they have a POST API endpoint to create a commit and a POST API endpoint to create a PR, but I don't see where to put the file I changed.

Using something like simple-git could be a solution.
This library is a thin wrapper around the git command line.
Instead of using the GitHub API you could drive the git commands programmatically with a library like this.
That way you don't have to work with the much more complicated API, and you get the benefit of supporting other source control hosts too.
Looking at their example, we can see it's straightforward to commit and push a file from JavaScript:
require('simple-git')()
  .add('./*')
  .commit('first commit!')
  .addRemote('origin', 'some-repo-url')
  .push(['-u', 'origin', 'master'], () => console.log('done'));
Make sure you refer to the usage documentation before trying that example to ensure you configure everything correctly.
Alternatively, you can use the Octokit package to interface with the GitHub API more easily.
To add, commit and "push" a file via the API, you first create a tree containing the changed file, then create a commit that points at that tree, and finally update the branch reference to point at the new commit; updating the reference is the API's equivalent of a push.
You can find a working example on this Github issue.
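As a concrete illustration of that flow, here is a sketch using the method names from @octokit/rest; the changed file's content is inlined directly in the tree entry, which is where the file from the question goes. The openPrWithFile helper and all owner/repo/branch values are illustrative, not part of any official API:

```javascript
// A sketch of the tree -> commit -> ref -> PR flow, assuming an
// authenticated Octokit instance (e.g. from @octokit/rest) is passed in.
// All owner/repo/branch/path values are placeholders supplied by the caller.
async function openPrWithFile(octokit, { owner, repo, base, branch, path, content }) {
  // 1. Find the commit the new branch should start from, and its tree
  const { data: ref } = await octokit.git.getRef({ owner, repo, ref: `heads/${base}` });
  const baseSha = ref.object.sha;
  const { data: baseCommit } = await octokit.git.getCommit({ owner, repo, commit_sha: baseSha });

  // 2. Create a tree containing the changed file, on top of the base tree
  const { data: tree } = await octokit.git.createTree({
    owner,
    repo,
    base_tree: baseCommit.tree.sha,
    tree: [{ path, mode: '100644', type: 'blob', content }],
  });

  // 3. Commit the tree and point a new branch at that commit
  // (creating/updating the ref is the API's equivalent of "git push")
  const { data: commit } = await octokit.git.createCommit({
    owner,
    repo,
    message: `Update ${path}`,
    tree: tree.sha,
    parents: [baseSha],
  });
  await octokit.git.createRef({ owner, repo, ref: `refs/heads/${branch}`, sha: commit.sha });

  // 4. Open the pull request from the new branch into the base branch
  return octokit.pulls.create({ owner, repo, title: `Update ${path}`, head: branch, base });
}
```

Calling this with the parsed-and-modified package.json string as content would create the branch, the commit and the PR in one go.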

Related

Using Azure's azcopy tool from within a JavaScript Azure Function

My setup is the following: I have 1 storage account with a container and I have another storage account with a different container. What I want to do is have a blob trigger activate whenever someone uploads a file to the first storage account, and have that blob be copied to the second storage account's container. azcopy works well with the command line, but I have not found a way to use it within an azure function. Any help is appreciated, thanks.
For Node.js, you would use Child Processes (or a wrapper like execa) to run executables, but that isn't something I would recommend here. Also, when running on Azure, you would have to make sure azcopy is present on the host; if you still need to go down this path, Custom Containers would be your best bet.
In the case of Azure Functions, if the file just needs to be copied to a different container, you could use the Blob Output Binding, which achieves this with almost no code.
For more complex scenarios where the output binding falls short, you could use the Blob Storage Node.js SDK directly in your code.
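For the SDK route, a server-side copy avoids pulling the blob through the function at all. A sketch, assuming the destination ContainerClient (from @azure/storage-blob) is constructed elsewhere with the second account's credentials, and that the source blob URL is readable by the destination account (e.g. via SAS); copyBlob and the names below are illustrative:

```javascript
// Sketch: server-side copy into a destination container.
// destContainerClient is assumed to be an @azure/storage-blob
// ContainerClient for the second storage account's container.
async function copyBlob(destContainerClient, blobName, sourceUrl) {
  const destBlob = destContainerClient.getBlobClient(blobName);
  // The destination account pulls the blob from the source URL;
  // poll until the copy operation reports completion
  const poller = await destBlob.beginCopyFromURL(sourceUrl);
  return poller.pollUntilDone();
}
```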

process.env.proxy no longer populated in React.js in Node 8.8.1

I set the proxy in the package.json file like this:
"proxy": "http://localhost:8000/"
and I used to be able to read the value using
process.env.proxy
Unfortunately since I updated to the latest node v8.8.1 the proxy is no longer part of the process.env properties.
Where did it go and how can I get it?
PS: I need the value because I'm creating links in my UI that point to the API backend.
Cheers.
Just require('./package.json') and extract it from there.

Can AdonisJs be used for REST APIs?

Sorry for a nooby question. I'll ask it anyway!
I am playing around with AdonisJs. I understand it is an MVC framework, but I want to write REST APIs with it. I could not find much help on the internet.
I have two questions:
Does the framework support writing REST APIs?
If yes to 1. then what could be the best starting point?
1. I've created three API projects with AdonisJS and think it's ideal for a quick setup. It includes a lot of functionality out of the box, supports database migrations and is generally well documented.
You can create routes easily with JSON responses:
http://adonisjs.com/docs/3.2/response
Route.get('/', function * (request, response) {
  const users = yield User.all()
  response.json(users)
})
Or add them to a controller, and even fairly easily add route authentication with token protection (all documented):
Route.post('my_api/v1/authenticate', 'ApiController.authenticate')

Route.group('api', function () {
  Route.get('users', 'ApiController.getUsers')
}).prefix('my_api/v1').middleware('auth:api')
2. Take a look at the official tutorial; you can probably finish it in about half an hour: http://adonisjs.com/docs/3.2/overview#_simplest_example
Start by defining some routes and try echoing simple variables, both as JSON and in regular views.
Move the test logic to Controllers.
Read a bit more about database migrations and add some simple models.
Don't forget the Commands and Factory, as you can easily define test-data commands there. This will save you a lot of time in the long run.
Just keep in mind that you need a server with Node.js installed to run the system in production (personally I keep it running with a tool like Forever).
In order to create just a RESTful API you can use:
npm i -g @adonisjs/cli
# Create a new Adonis app with only the API boilerplate
adonis new project-name --api-only

sigma.js: can't display graph with json

I'm having difficulty displaying a graph from a JSON file. I tried running the example (load-external-json.html) provided with the library, but it doesn't display the graph.
I've looked around the web for a solution but haven't found one that solves my problem.
The plug-in used: plugins/sigma.parsers.json/sigma.parsers.json.js
sigma.parsers.json('data/arctic.json', { // Here I need to put the file path instead of "data.json"
  container: document.getElementById('container'),
  settings: {
    defaultNodeColor: '#00a',
    edgeColor: 'default',
    defaultEdgeColor: '#a00'
  }
});
Best Regards,
luis
I believe this plugin makes an XHR request to get the specified file (in this case data/arctic.json). That means you may need to run a server locally in order to see this work properly.
If you have Python installed, an easy way to do this is to run python -m SimpleHTTPServer (or python3 -m http.server on Python 3) from the directory where load-external-json.html is located. Then you should be able to see it live at http://localhost:8000/load-external-json.html.
From sigma.js wiki on github:
Sigma provides many code examples to show you what it can do and how to use it. Some of these examples load external data files, and you need to access them through a local server to see the examples working.
You need to run a simple Node-based HTTP server. Just follow the instructions on the wiki:
https://github.com/jacomyal/sigma.js/wiki

Watch FTP folder via node.js

I need to watch an FTP folder for file create/update/remove events.
But I can't find any suitable way to do this using node.js.
Can you help me, please?
1. Get a recursive listing of the remote FTP directory and save it (look at https://github.com/mscdex/node-ftp and https://github.com/evanplaice/node-ftpsync).
2. Set a timeout to fetch a new recursive listing.
3. Compare the new and old listings (look at https://github.com/andreyvit/json-diff) and call the handlers for the corresponding events.
4. Overwrite the old listing with the new one.
5. Return to step 2.
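The comparison step of that polling loop can be sketched as a pure function. A "listing" here is assumed to be a plain object mapping each remote path to a fingerprint (e.g. size plus mtime extracted from the FTP LIST response); fetching it (e.g. with node-ftp) and the timer loop are not shown:

```javascript
// Sketch: compare two saved listings and derive create/update/remove events.
// oldListing/newListing map remote paths to a fingerprint string or number.
function diffListings(oldListing, newListing) {
  const events = [];
  for (const [file, fingerprint] of Object.entries(newListing)) {
    if (!(file in oldListing)) {
      // Path appeared since the last poll
      events.push({ type: 'create', file });
    } else if (oldListing[file] !== fingerprint) {
      // Path exists in both but its size/mtime fingerprint changed
      events.push({ type: 'update', file });
    }
  }
  for (const file of Object.keys(oldListing)) {
    if (!(file in newListing)) {
      // Path disappeared since the last poll
      events.push({ type: 'remove', file });
    }
  }
  return events;
}
```

On each polling tick you would fetch a fresh listing, call diffListings(previous, current), dispatch the events to your handlers, then set previous = current.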
You can also use the sftp-watcher module, which will save you time:
https://www.npmjs.com/package/sftp-watcher
