Make Mithril application SEO friendly - javascript

My friend and I have created a website that we want to use as an experiment for a school project.
https://www.daniellindgren.se/
But we are encountering some problems when we want Google's bots to crawl the subpages, like CV and contact.
When we use Google Webmaster Tools to check how the indexing is going, it says that Google can't crawl anything other than the start page.
We have built a sitemap and we have also declared it in robots.txt.
But we read somewhere that Mithril can cause problems for Google's bots because its links to subpages start with a "?".
Is there any workaround we can use, or what other solution is there? Should we maybe try to re-make it as a single-page application instead?

I don't see any "?" in the links on your site, and in general Google should be able to index SPAs nowadays.
But it doesn't always work, so an option could be to use Mithril to render the templates server-side as well. Depending on your backend, it may take a little bit of work. If you're using Node.js it's easy with mithril-node-render; if not, I recommend Haxe and mithril-hx for cross-platform support.
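For example, here is a minimal sketch of what that server-side rendering could look like with Node.js, Express and mithril-node-render (the component and route are made up, and the exact require paths and return types vary between package versions):

    // a minimal server-side rendering sketch; assumes Node.js with the
    // express, mithril and mithril-node-render packages installed
    const m = require('mithril/hyperscript');
    const render = require('mithril-node-render');
    const express = require('express');

    // a hypothetical page component, ideally shared with the client bundle
    const CvPage = {
      view: () => m('main', [m('h1', 'CV'), m('p', 'Server-rendered content')])
    };

    const app = express();

    app.get('/cv', (req, res) => {
      // render() resolves to an HTML string for the component
      render(m(CvPage)).then(html => {
        res.send('<!doctype html><html><body>' + html + '</body></html>');
      });
    });

    app.listen(3000);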
Then you need to change the routing strategy so that a request from outside the application hits the server as well. Unless you plan for it from the beginning, you'll probably need to rewrite quite a bit of the backend to make it more isomorphic.
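As for the "?" URLs: in Mithril 0.2.x those come from the default routing mode, "search". A rough sketch of switching to path-based URLs, which only works if the backend answers for those paths too (the page components here are hypothetical):

    // Mithril 0.2.x sketch: serve real paths instead of "?/..." URLs
    // (in Mithril 1.x+ the equivalent knob is m.route.prefix)
    var HomePage    = { controller: function () {}, view: function () { return m("h1", "Home"); } };
    var CvPage      = { controller: function () {}, view: function () { return m("h1", "CV"); } };
    var ContactPage = { controller: function () {}, view: function () { return m("h1", "Contact"); } };

    m.route.mode = "pathname";
    m.route(document.body, "/", {
        "/": HomePage,
        "/cv": CvPage,
        "/contact": ContactPage
    });
    // every route must now also exist on the server, so a direct request
    // to /cv returns the app (or server-rendered HTML) instead of a 404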
Your site doesn't have much client-side functionality, however, so as it is right now I'd treat it as a non-SPA and use Mithril where you want some dynamic, Ajax-driven functionality.

Related

Can I use Google sheets as a small database for my apps?

I have a personal finances spreadsheet on my Google Drive.
For some practice and ease of use, I wanted to make a desktop and mobile app to manipulate it:
add an expense
list my expenses
filter the expenses
etc...
Is there a way to do this?
Since it is something so small, I'm trying to avoid anything that needs a dedicated server I would have to set up, or renting any services, because it's just an ease-of-use project and not a "must-have".
I've been searching a lot, but most of the time I get confused by answers from 2-3 years ago.
EDIT:
So I found a way, using this video:
https://youtu.be/3OakodfKjrU
Basically, using Google Apps Script you can handle POST and GET requests, which for me is a step in the right direction. I just needed to think outside the box a little, and I got a simple way to send POST requests to create or update, and GET requests to read. For now I also use GET to delete, but that just seems wrong; I might change it later.
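To give an idea of the shape this takes, here is a minimal sketch of such an Apps Script web app (the sheet name and columns are assumptions; you deploy it as a web app and call its URL from the client):

    // Google Apps Script sketch of the GET/POST approach described above.
    // The sheet name and column layout are made-up placeholders.
    function doGet(e) {
      var sheet = SpreadsheetApp.getActiveSpreadsheet().getSheetByName('Expenses');
      var rows = sheet.getDataRange().getValues(); // [[date, label, amount], ...]
      return ContentService
        .createTextOutput(JSON.stringify(rows))
        .setMimeType(ContentService.MimeType.JSON);
    }

    function doPost(e) {
      var body = JSON.parse(e.postData.contents); // e.g. {date, label, amount}
      var sheet = SpreadsheetApp.getActiveSpreadsheet().getSheetByName('Expenses');
      sheet.appendRow([body.date, body.label, body.amount]);
      return ContentService.createTextOutput('ok');
    }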
I've also searched around and started building a PWA (progressive web app) as the interface, so I can build it with a single code base and have it work on almost any device, using pure JS, HTML and CSS; the only "non-pure" thing is Onsen UI for the GUI.
You can use an npm package such as google-spreadsheet to work with the Google APIs and manipulate your spreadsheet; the issue with building the app without your own server is the need to store your Google account credentials securely (you don't want them to be publicly accessible).
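For reference, a rough sketch with google-spreadsheet (this is the v3-style API, which has changed between major versions; the spreadsheet ID and credentials file are placeholders):

    // a sketch using the google-spreadsheet npm package (v3-style API);
    // the spreadsheet ID and credentials path are placeholders
    const { GoogleSpreadsheet } = require('google-spreadsheet');
    const creds = require('./service-account.json'); // keep this off the client!

    async function addExpense(date, label, amount) {
      const doc = new GoogleSpreadsheet('YOUR-SPREADSHEET-ID');
      await doc.useServiceAccountAuth(creds);
      await doc.loadInfo();
      const sheet = doc.sheetsByIndex[0]; // first worksheet
      await sheet.addRow({ date: date, label: label, amount: amount });
    }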
If you could compromise and use a read-only solution, a package such as tabletop might suffice, as you wouldn't need to bother with any back-end work; I doubt this would be ideal for you, though. You will need to host the application regardless if you want to be able to access it over the web.
Alternatively, you could run the application locally if you would be happy with only being able to use the app that way. Hope it helps and sorry for the rambling!

Is it possible to edit and save website content to server, making the change viewable by everyone?

I have created a website for a third party who have no experience editing HTML. However, the third party wishes to be able to edit the content on the website without having to open the files and edit them directly; they wish to do it somewhat WYSIWYG (for example, hitting an "edit" button for the content they wish to edit). Is this possible to achieve? It is not an internal website, and it has user tracking (editing should obviously only be available to admin users).
Is there a way of making the contents of a div editable, then saving the change directly to the server, so the content gets updated publicly?
I am currently researching the topic, and although I have found some indications that the solution may be a PHP script, I have yet to find any definitive solutions or examples of similar functionality.
Yes, you will need a backend language or framework to achieve this. While JavaScript is used to interact with the page, actually storing the information requires a database or similar server-side technology.
Unfortunately, which backend language or framework to choose really is the million-dollar question. It largely depends on exactly what you are trying to accomplish, what your client or user is comfortable with, and how much programming experience you have.
PHP is a fast and time-tested backend language. Node is the new kid on the block, and it is very popular too. Java and .NET are on the way out. You can dig up a bunch more, including Go, Python, Haskell, etc.
You could take a language listed above and start scripting away, but this can be time-consuming and error-prone. Most people use a framework to get started and program using that framework's tools. The most popular PHP framework is WordPress, but it is designed for blogs and might not fit your use case. I use the framework Craft CMS, which is very customizable. But the way you are phrasing the question, a framework might be overkill. This is really up to you to decide after researching the available options and comparing them to what you wish to accomplish.
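As a rough illustration of the editable-div idea from the question: the sketch below posts the edited HTML to a hypothetical save.php endpoint, which would have to verify the user is an admin, sanitize the HTML, and store it in a database:

    <!-- a sketch: an editable region that posts its HTML back to the server.
         "save.php" is a made-up endpoint; a real one must check that the
         user is an admin and sanitize the submitted HTML -->
    <div id="intro" contenteditable="true">Editable content here.</div>
    <button id="save">Save</button>

    <script>
    document.getElementById('save').addEventListener('click', function () {
      fetch('save.php', {
        method: 'POST',
        headers: { 'Content-Type': 'application/json' },
        body: JSON.stringify({
          id: 'intro',
          html: document.getElementById('intro').innerHTML
        })
      }).then(function (res) {
        alert(res.ok ? 'Saved' : 'Save failed');
      });
    });
    </script>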
For the WYSIWYG, you might want to look into the following tools for the client to edit content:
https://imperavi.com/redactor/
https://ckeditor.com/
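For instance, CKEditor 4 can be dropped onto a textarea roughly like this (a sketch; you would still need a save endpoint like the one described above):

    <!-- a sketch: turning a textarea into a WYSIWYG editor with CKEditor 4 -->
    <textarea id="content">Initial HTML content</textarea>
    <script src="https://cdn.ckeditor.com/4.22.1/standard/ckeditor.js"></script>
    <script>
      CKEDITOR.replace('content'); // replaces the textarea with the editor
      // later: CKEDITOR.instances.content.getData() returns the edited HTML
    </script>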
Hopefully this provides some direction, happy coding!

Angular/React - Where do they fit in?

I would like to start looking into Angular or React, but I'm having a hard time at the minute figuring out where they fit in.
I currently build all our sites using PHP-based ExpressionEngine or Craft CMS. Is it possible to use Angular or React with these? Would I be correct in thinking they act as the whole front-end?
So, for example, would I use EE/Craft just to create the APIs to fetch/post data, and then Angular/React would generate the pages using the data from those calls?
That is exactly what I would do. I am not overly familiar with the CMS frameworks you are using, but do have a good bit of experience with CMS development. Typically I leverage the provided APIs to bring the data to the presentation layer, and then use a JavaScript framework such as Angular to create my UI.
This approach will work great if you can get away with not using any of the CMS server controls, and perform all data operations through API calls.
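As a small sketch of that division of labour, here is a React component pulling from a placeholder /api/articles endpoint, standing in for whatever EE/Craft exposes:

    // React sketch: the CMS only serves JSON, React renders the UI.
    // "/api/articles" is a made-up stand-in for your EE/Craft endpoint.
    import React, { useEffect, useState } from 'react';

    function ArticleList() {
      const [articles, setArticles] = useState([]);

      useEffect(() => {
        // fetch the data from the CMS API once on mount
        fetch('/api/articles')
          .then(res => res.json())
          .then(setArticles);
      }, []);

      return (
        <ul>
          {articles.map(a => <li key={a.id}>{a.title}</li>)}
        </ul>
      );
    }

    export default ArticleList;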
Using a CMS as an API for your data is fine, but you might be better off with something custom-made (like a Rails or Node.js project); CMSs are not ideal if you're building just an API. Typically, you'd build your React/Angular website as a static site that you can deploy to a hosting provider (like S3 or GitHub Pages), which gives the immediate benefit of improved security, speed and scalability. From your static React or Angular site you then call your server APIs to fetch the data.
However, this only fixes the data problem, not the content-management problem. Since you have a static site, there is no way for your content editors or marketers to change the content on your website; everything has to be done in the source code, by a developer. You can fix this by adding a drop-in content-management solution like INSTANT on top of your static website. Without any coding, it allows anyone with an account to change content directly on the website.

Make AngularJs crawlable for search engines

I am a noob with Angular and with making JavaScript crawlable. I've been searching for it, but I don't really get it so far.
I am working on an AngularJS thingie which uses client-side JSON.
There is a navigation with pages, but... each link uses a function getPage(n) to slice a chunk of JSON, which Angular then renders.
Is it OK to put href="#!page=n" on each link? When I add that #! hash to the URL and press enter, a function renders the right items; is that enough to make it crawlable?
I've read something about snapshots, but that requires Java? I have a webhost which is not really flexible: it does NOT support Tomcat or Node.js.
I think it's much better practice these days to use HTML5 history.pushState, and thus provide a unique URL for every page.
More information here.
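In AngularJS terms, that roughly means enabling HTML5 mode, as in this sketch (the server must also be configured to serve the app for every deep URL it doesn't otherwise recognize):

    // AngularJS sketch: pushState-based URLs instead of hash URLs
    angular.module('app', []).config(function ($locationProvider) {
      $locationProvider.html5Mode(true);
    });
    // also add <base href="/"> to your index.html, and configure the
    // server to return the app for any deep URL, so /page/2 still loads it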
Check this older stackoverflow question - Making angular crawlable - Beginning of Project
A friend of mine uses https://prerender.io/
Both of these solutions essentially cache rendered versions of your views, so the crawler can index your site.

Create a plugin to add my website functionalities on another website

I have a website A, with a database and a search engine for certain objects; users can create an account on my website and then add comments on these objects.
I need to create an API with something like a plugin; the result will be having the search engine available on another website, B.
I planned to do it like the FB or Twitter plugins: a dev who wants to use my API just needs to add a line of JS and a line of HTML on website B, and it will load the plugin. But I'm wondering how to organize it.
Here is what I've guessed: I create a page on website A and put the search engine on it. I write a JS file that loads this page within an iframe on the dev's page (website B), under the div he added to host my plugin. Then I implement OAuth 2 (with a provider and so on, so people can make POST requests to alter my DB), and people who are on website B will be able to post comments on the objects in the search engine on website B.
Actually it seems to be the same as the FB comment plugin's process, but it seems too complicated to do all that stuff. Is this the right way? Can anyone detail the problems I should expect to face during implementation?
You need to develop a decent API which can return search results in JSON (and XML if you want to please everyone). That alone would offer other developers the ability to use your site's functionality, so they can develop their own widgets; that's mostly back-end work, though.
To develop your own search widget you don't need that much: just a fixed target (maybe a required element) and/or an initialization method (more flexibility for the dev) to which you pass a target.
Bind the search button, grab the search terms, call your API, and when you get your results, display them and/or execute a custom callback (passing the results as an argument, for flexibility).
If you do your JavaScript well, you can create a little API there too which makes it easier to use your server API from JavaScript, and then even easily port it to a jQuery plugin or something similar.
When serving JSON, always remember to set your API's headers to allow cross-domain (CORS) requests, or go for JSONP instead.
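Put together, the widget side of this could be as small as the following sketch (the endpoint and all names are made up; it assumes the API replies with JSON and sends an Access-Control-Allow-Origin header so the browser allows the cross-domain call):

    // a sketch of the search widget described above; the endpoint and
    // names are placeholders, not a real API
    window.MySearchWidget = {
      init: function (target, onResults) {
        var input = document.createElement('input');
        var button = document.createElement('button');
        button.textContent = 'Search';
        button.addEventListener('click', function () {
          fetch('https://site-a.example/api/search?q=' +
              encodeURIComponent(input.value))
            .then(function (res) { return res.json(); })
            .then(function (results) {
              if (onResults) onResults(results); // dev-supplied callback
              else target.textContent = JSON.stringify(results);
            });
        });
        target.appendChild(input);
        target.appendChild(button);
      }
    };

A dev on website B then includes your script and calls MySearchWidget.init(document.getElementById('my-search')), optionally passing their own results handler.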
Your question implies an architectural direction, but the requirements are too broad for such a choice. I would narrow down your requirements first. OAuth 2.0 could potentially help satisfy your needs, but ask yourself at least the following:
What user data needs to be protected?
What 3rd-party data access is needed? What functionality?
If you go with OAuth 2.0, are you prepared to follow a spec which is still changing? Are you willing to be the authentication provider?
What server-side languages/platforms are acceptable?
What other security considerations are important to you? (Such as social sharing, level of 3rd-party app trust...)
How much are you willing to rely on 3rd-party tools? Or write your own?
I agree that modeling your design off FB or Twitter or Google is not a bad idea, as they have done this sort of thing.
You might have a look at the new book Getting Started with OAuth 2.0.
Here are two simple ways of making custom search available to users.
The simplest option is to do what Google does - the search on your site would follow a simple, well defined API - so that
www.mysite.com/search?q=keyword1+keyword2
returns a list of results in HTML.
Then you'd tell your users to include a snippet of HTML:
    <form action="http://www.mysite.com/search" method="get">
      <input name="q" type="text" value="Search">
    </form>
That would do, though at this juncture you may want to improve things: better search options, a JavaScript wrapper for the search form, a JSON or XML format for the returned data, security, and a better-worked-out API that takes all of these into account.
Another way is to use JavaScript and provide the data with a callback facility, so that the URL:
www.mysite.com/js-search?q=keyword1+keyword2&callback=someHandle
will return a JavaScript file containing the JSON data and a call to "someHandle" when it is loaded. Developers using your API have to write their own way of making the request, plus the handler. Bear in mind that because of cross-site scripting protections, the queries would probably have to come from your partners' servers. The simplest option is probably to make your own search offering simple and well-documented so others can exploit it.
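A page consuming that callback-style endpoint might wire it up roughly like this sketch (someHandle matches the callback parameter in the URL above; the result shape is assumed):

    // a sketch of consuming the callback-style endpoint described above;
    // the handler must exist before the script is injected
    function someHandle(results) {
      console.log('got ' + results.length + ' results');
    }

    var script = document.createElement('script');
    script.src = 'http://www.mysite.com/js-search?q=keyword1+keyword2' +
                 '&callback=someHandle';
    document.body.appendChild(script);
    // the returned file is JavaScript that calls someHandle(<JSON data>)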
OAuth 2 is only helpful if you want to allow website B to make POST requests to website A in the background.
If, instead, you want to allow users visiting website B to post a comment, then an iframe with a form that points to website A is enough.
The easier bet, yet not necessarily the wisest, is to create some JS which calls your website using JSONP.
Iframes have their drawbacks; try avoiding them if possible. Write some JavaScript with a few event handlers that make JSONP calls to your own server and render the results, so the widget can interact with the page.
