Listing files of a directory in a static webpage - javascript

Is there any way to list the files of a directory in a static webpage with the link to view the file?
I would like to upload some PDF files to a certain directory of my static website (it uses HTML and JS), and want to show the list of the files in a page with the link to view the pdf files. That way, if I upload new files, I don't have to modify the HTML page every time. Is there any way to do that?

If you are using Apache as a web server and have configured mod_autoindex for the directory you upload the PDF files to, then navigating to that URL should show an auto-generated index of the files.
This auto-generated page can easily be parsed with jQuery:
var pdfFilesDirectory = '/uploads/pdf/';

// fetch the auto-generated index page
$.ajax({ url: pdfFilesDirectory }).then(function (html) {
  // wrap the returned HTML in a temporary jQuery object
  var $listing = $(html);
  // find all links ending with .pdf
  $listing.find('a[href$=".pdf"]').each(function () {
    var pdfName = $(this).text();
    var pdfUrl = $(this).attr('href');
    // do what you want here
  });
});
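For example, the placeholder in the loop could hand each match to a small helper that renders it as a link on the page. A minimal sketch, assuming the page contains an empty <ul id="pdf-list"> (a made-up element) and that Apache's autoindex hrefs are relative to the listing directory:
// Renders one entry into an assumed <ul id="pdf-list"> element.
// Autoindex hrefs are usually relative, hence the directory prefix.
function appendPdfLink(name, url) {
  var $link = $('<a>').attr('href', url).text(name);
  $('#pdf-list').append($('<li>').append($link));
}

// from inside the loop above:
// appendPdfLink(pdfName, pdfFilesDirectory + pdfUrl);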

You need a server-side implementation; you could do this with PHP, for example. You cannot do it with JavaScript alone, because JavaScript runs on the client side and cannot read the files on the server.

I made a node module to automate the task of getting all files and folders: mddir
Usage
node mddir "../relative/path/"
To install: npm install mddir -g
To generate markdown for current directory: mddir
To generate for any absolute path: mddir /absolute/path
To generate for a relative path: mddir ~/Documents/whatever.
The md file gets generated in your working directory.
Currently ignores the node_modules and .git folders.
Troubleshooting
If you receive the error 'node\r: No such file or directory', the issue is that your operating system uses different line endings and mddir can't parse them without you explicitly setting the line ending style to Unix. This usually affects Windows, but also some versions of Linux. Setting line endings to Unix style has to be performed within the mddir npm global bin folder.
Line endings fix
Get npm bin folder path with:
npm config get prefix
Cd into that folder
brew install dos2unix
dos2unix lib/node_modules/mddir/src/mddir.js
This converts the line endings to Unix instead of DOS.
Then run as normal with: node mddir "../relative/path/".
I made another node module called agd to generate a tree view based on the other module: https://github.com/JohnByrneRepo/agd.
Auto generated documentation (Alpha)
Functionality so far:
Generates a tree folder structure in node, that is rendered as a treegrid in the browser. Click on a file (non-root level) to populate main view.
Coming soon:
Generates a documentation guide including function names and parameters, function dependencies, and more. Initially compatible with jQuery and plain JavaScript function namespacing, soon to be compatible with React, Redux, Angular 1, Angular 2 and other frameworks on request.
Usage
node agd relativePath
e.g. node agd '../../'
This generates code.json.
Run 'node http-server' then open the browser to view the file structure rendered in the sidebar. Larger projects can take up to a minute or two to render.
See code.json for example generated data.
To-do: Add code content for top level files. Move tree html generation into node.
Contact html5css3#outlook.com
MIT License
Example generated tree structure

Open a browser
Navigate to the folder you want listed
If you see a list of the files, continue
If you don't see a list of the files, take a look at this question to figure out how to enable it: Is it possible to get a list of files under a directory of a website? How?
Make an ajax call to that folder (example below)
The response will be the HTML of the listing page
You can parse that out to get the file listing (a parsing sketch follows the example below)
Example:
// This is in AngularJS, but you can use whatever you like.
// (.success was removed in AngularJS 1.6, so use the standard .then.)
$http.get('folder_to_list/').then(function (response) {
  // response.data is the HTML of the listing page
  console.log(response.data);
});
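As mentioned in the parsing step above, the response is just the listing page's HTML. A framework-agnostic sketch of extracting the PDF links from it with the built-in DOMParser (the folder path is the same placeholder as in the example):
// Fetch the listing page and pull out every link ending in .pdf.
fetch('folder_to_list/')
  .then(function (response) { return response.text(); })
  .then(function (html) {
    var doc = new DOMParser().parseFromString(html, 'text/html');
    doc.querySelectorAll('a[href$=".pdf"]').forEach(function (link) {
      console.log(link.textContent, link.getAttribute('href'));
    });
  });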

Instead of JavaScript, which runs only on the client side, you should consider using PHP or another server-side language to crawl your directory of files and list them inside an HTML file/template. PHP, for example, has the scandir function, which can list the files in a directory.

You need a combination of JavaScript and PHP. You can call the PHP file from JavaScript using an AJAX call.
Try this PHP file, which should return a JSON-encoded array:
<?php
// Path to the PDF directory (it must also be reachable as a URL
// for the generated links to work).
$directory = "/directory/path";
$pdffiles = glob($directory . "/*.pdf");

$files = array();
foreach ($pdffiles as $pdffile) {
    $files[] = "<a href=\"" . $pdffile . "\">" . basename($pdffile) . "</a>";
}

header('Content-Type: application/json');
echo json_encode($files);
Now you just need to loop through the JSON array to list the URLs.
Something like:
$.getJSON("file.php", function (data) {
  var items = [];
  // $.each passes (index, value) to the callback
  $.each(data, function (index, val) {
    items.push(val);
  });
  $("body").append(items.join(""));
});
Did not test it, but something like this should work.
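If you'd rather not depend on jQuery on the client, a fetch-based equivalent of the same loop could look like this (a sketch, assuming the PHP script above is reachable as file.php):
// Fetch the JSON array of link strings produced by file.php
// and append them to the page.
fetch('file.php')
  .then(function (response) { return response.json(); })
  .then(function (links) {
    document.body.insertAdjacentHTML('beforeend', links.join('<br>'));
  });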

Keep it simple. Put all your files in a directory and don't put an index page in that directory, so the server generates a listing for it. Then, in the page you want, add an iframe that shows that directory. You will see the list of files you uploaded as hyperlinks, and when you click a link, the iframe will show the PDF file.
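If you prefer to inject that iframe from a script instead of hard-coding it in the HTML, a minimal sketch (the /uploads/pdf/ path is an assumption, and the server must generate an index for that directory):
// Point an iframe at the upload directory so the server's auto-generated
// listing (with links to the PDFs) is shown inside the page.
var frame = document.createElement('iframe');
frame.src = '/uploads/pdf/'; // assumed upload directory
frame.width = '100%';
frame.height = '600';
document.body.appendChild(frame);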

I have the same problem.
I used to use an Apache web server with its 'fancy directory listings' and had everything set up that way, complete with headers, footers, color schemes, etc.
Then I migrated to GitLab's web servers, which serve pure static pages only. NO directory listings. Arrggghh...
My solution...
I continue to have the pages served by a local Apache server (not world-accessible), then I download the "index.html" file it generates before uploading that index page to GitLab.
Example page generated from apache fancy directory listing...
https://antofthy.gitlab.io/info/www/
I do the same for a set of pages that used Server-Side Includes (shtml), having Apache expand each page to static HTML.
Example apache SSI generated page...
https://antofthy.gitlab.io/graphics/polyhedra/
Of course this does not work with pages that rely on executable output or CGI scripts, but for directory listings it is just fine.
Of course I would prefer to find an SSG that understands Apache fancy directory listings, SSI, or even basic directory listings, without being overkill. But that is an ongoing search.

Related

How to add dynamic json in a build in react.js?

I have a React app that loads data from a local JSON file using
import data from './data.json'
It is all working fine without any issues, and I'm able to open my index.html on port 3000 and also open it as a simple HTML page in the browser.
Now I run npm run build, which creates the build directory. However, my JSON values become effectively static, as they are hardcoded into the JavaScript in the build. So the question is: how can my code in the build directory read JSON files from a specific location dynamically?
My question: why not use fetch and serve the JSON from a server-side API?
To partially answer your question:
Without changing any webpack configuration, you can use the import() function instead of import, and a separate chunk will be built with the JSON content inside a JS file.
async function fn() {
  // dynamic import: the JSON ends up in its own chunk
  const json = await import("./foo.json");
  document.title = json.bar;
}
On the other hand, webpack probably has a way to configure this output to be plain JSON, but for that you'll need to run npm run eject or use a tool to override the webpack production config.
Apart from the other alternatives, what you're looking for in vanilla JavaScript is the Fetch API. It's possible to read from either local or remote URLs via the fetch method.
As per the example you provided above, instead of doing this:
import data from './data.json'
You can make use of it like this:
fetch('./data.json')
It also works in pretty much the same way for any URL:
// Sample working URL example to mock some real data
fetch('https://jsonmock.hackerrank.com/api/football_competitions?year=2015')
And the best part is that the parameter fetch accepts can be built dynamically, since it takes a local file path or a URL in the very same way:
let baseURL = 'https://jsonmock.hackerrank.com/api',
    endpointToCall = 'football_competitions',
    year = '2015',
    url;

// interpolate every part into the final URL
url = `${baseURL}/${endpointToCall}?year=${year}`;

fetch(url);
Note: with the last example above, my point is to break the same API endpoint used in the previous example into dynamic variables, to make it clearer. Please let me know if it isn't and you need more clarification.
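For completeness, the response still has to be parsed before it can be used. A minimal sketch, assuming data.json sits next to the built index.html (with Create React App, placing it in the public/ folder has that effect):
// Load data.json at runtime instead of baking it into the bundle.
fetch('./data.json')
  .then((response) => response.json())
  .then((data) => {
    console.log(data);
  })
  .catch((error) => console.error('Could not load data.json', error));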
What you can do is, before you run npm run build, make a request to your server to get the data.json file, and then run npm run build once it has been downloaded. You can write a simple script for it.
For example:
#!/bin/bash
# Get the file from the server
curl https://yourServer/data.zip -o data.zip
# Extract the archive
unzip data.zip
# Move the file to the desired directory
mv data.json /yourApp/data/data.json
# Navigate to the directory where the npm package is
cd /yourApp/
# This one is optional but you should run a test to see if the app won't crash with the new json data that you fetched
# Run tests
npm run tests
# Run the build command for React
npm run build
You can modify this script with your paths and it should work.
Summary
Get the json data with curl
Unzip it
Move it to your react app where data.json is and replace it
Run the tests (optional)
Run the build
You're done.
Hope this helps.

Access directories which look like files on Mac

I am creating a script for the Chrome browser to handle files (using the File System Access API). This works totally fine on Windows, but on Mac I have this issue:
The files are stored in folders which look like files on Mac. For example, the folder name is thisisfolder.xyz and inside there are files like file.xml.
thisisfolder.xyz
file.xml
file2.xml
...
If I use the directory handle (handle.getDirectoryHandle(someDirectory)), those folders (thisisfolder.xyz) are greyed out and can't be selected. If I use the file handle (handle.getFileHandle(someDirectories)), I can select thisisfolder.xyz and similar, but later when I want to access the files in this folder, I can't, because the API thinks those folders are files.
var subdirHandle = await handle.getDirectoryHandle(someDirectory);
for await (var [name, entry] of subdirHandle.entries()) {
  ...
}
Is there anything I can do here?
This is working as intended. If you try to create a folder named test.txt on a Mac, the Finder even helpfully warns you about the issue you are experiencing:
Are you sure you want to add the extension ".txt" to the end of the name? If you make this change, your folder may appear as a single file.

Why is eleventy trying to parse a file in a passthrough copy?

I added a "scripts" folder as a passthrough copy to my Eleventy site and installed some npm dependencies inside it for scripts to be run on the page load.
So my .eleventy.js has some lines like this:
eleventyConfig.addPassthroughCopy("img");
eleventyConfig.addPassthroughCopy("scripts");
eleventyConfig.addPassthroughCopy("css");
But when I run npx eleventy, I get a build error that says,
Language does not exist: sh
Problem writing Eleventy templates: (more in DEBUG output)
> Having trouble rendering liquid (and markdown) template ./scripts/wb-service-worker/node_modules/bs-fetch/README.md
Why is it trying to "render liquid" in a passthrough copy? (I thought the whole point of passthrough copies is that Eleventy doesn't try to parse them.) How do I get it to stop?
addPassthroughCopy will copy the files through without parsing, but if a file is also in the input directory, Eleventy will process it in its normal way too.
You should keep the assets that you want to be passthrough-copied in a separate folder from the source files that you are feeding to Eleventy for processing (see the sketch after the links below).
See these docs for more help:
Input Directory
Ignoring Files
Passthrough's handling of input files
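For example, one common layout keeps the templates in a dedicated input directory and leaves the asset folders outside it, so they are never picked up as templates. A sketch of .eleventy.js under that assumption (the src and _site names are placeholders, and it assumes passthrough copy paths resolve from the project root; see the passthrough docs linked above):
// .eleventy.js — sketch assuming templates live in src/ while img/,
// scripts/ and css/ sit at the project root, outside the input directory.
module.exports = function (eleventyConfig) {
  eleventyConfig.addPassthroughCopy("img");
  eleventyConfig.addPassthroughCopy("scripts");
  eleventyConfig.addPassthroughCopy("css");

  return {
    dir: {
      input: "src",   // only files under src/ are processed as templates
      output: "_site"
    }
  };
};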

Is it possible to pick the CSV file automatically from the file directory using Papa parse?

I have the following directory structure for my application:
css
files
js
images
index.html
My CSV file is located in the folder "files". I want my JS to pick the CSV file automatically from that directory and pass it to Papa Parse.
I need to implement it this way because it is the requirement; I am not allowed to pick the file using an HTML input tag.
Please let me know if this is possible and, if so, how I can implement it.
If it is not possible, please let me know another way to do this.
If this is on your local machine, you must set up a simple server which will allow your JS file to read it. This can be done as described below:
Navigate to the folder that contains your CSV files
Within that folder, run the command $ python3 -m http.server
This will allow the files to be available over the link http://lvh.me:8000/your_csv_file.csv
Next, in your JS file using Papa Parse, paste the following:
Papa.parse("http://lvh.me:8000/your_csv_file.csv", {
  // parameters to set
  download: true,
  header: true,
  complete: function (results) {
    console.log(results);
  }
});
In Chrome, you must enable cross-origin resource sharing; you could use the Allow-Control-Allow-Origin extension.
Using Chrome, navigate to your webpage and look at the console log. The content of your CSV file will be there!
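Also, when the CSV ships with the static site itself (for example in the files/ folder next to index.html) and the page is served over HTTP rather than opened via file://, a relative same-origin URL works without the extra local server or the CORS extension. A sketch, with the file name being an assumption:
// The CSV lives in the site's own files/ directory, so a relative,
// same-origin URL works with Papa Parse's download option.
Papa.parse("files/data.csv", {
  download: true,
  header: true,
  complete: function (results) {
    console.log(results.data);
  }
});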

api.add_files is missing a file in Meteor package

I'm adding three files for the client in my Meteor package like so:
api.add_files([
'lib/client/newsletter_banner.html',
'lib/client/newsletter_banner.css',
'lib/client/templates.js'
], ['client']);
newsletter_banner.html defines a template which is not available when I load the site. If I look through the sources in DevTools, I can see that the CSS and JS files are available, but the HTML file is not. Why is this? I've confirmed that the filename is correct and even changed it, thinking that name might be unavailable to me for whatever reason, but the file is still not included.
HTML files are loaded by the templating package, so you need to add it to your package as well:
api.use(['templating', 'spacebars', 'ui'], 'client');
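In context, the relevant part of package.js might look like this (a sketch; the package name and version are placeholders):
// package.js — sketch; metadata values are placeholders.
Package.describe({
  name: 'yourname:newsletter-banner',
  version: '0.0.1'
});

Package.onUse(function (api) {
  // 'templating' compiles the .html template files; without it,
  // newsletter_banner.html is never loaded on the client.
  api.use(['templating', 'spacebars', 'ui'], 'client');

  api.add_files([
    'lib/client/newsletter_banner.html',
    'lib/client/newsletter_banner.css',
    'lib/client/templates.js'
  ], ['client']);
});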
