I'm currently working on a Rails project that uses RSpec as the testing framework. We have various sets of Rails fixture data that let us test our UI with the application in different states.
However, I've been intrigued by using TestCafe to handle our functional UI testing. I've played around with various examples on the website and it seems great to use. But the one thing holding me back from introducing it into my project is that I'm not sure how to use TestCafe to set up something equivalent to Rails fixture data - i.e. I want to use TestCafe to set up different scenarios where my database is in a particular state so that I can then test the UI.
I've tried searching through TestCafe's documentation but haven't had much luck, since TestCafe uses the word "fixtures" to mean the particular page you're testing, not the mock data.
Unfortunately, there is no built-in support for handling test databases. But because tests are executed in a Node.js environment, you can use modules such as db-migrate and js-yaml to load your fixtures into the test database.
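As a minimal sketch of that idea, assuming a YAML fixture file and your own database access layer (the `test-db` module and its `truncate`/`insert` helpers below are hypothetical stand-ins):

```js
import fs from 'fs';
import yaml from 'js-yaml';
import { Selector } from 'testcafe';
import db from './test-db'; // hypothetical DB helper module

// Load YAML fixture data into the test database before each test.
async function loadFixtures(file) {
    const tables = yaml.load(fs.readFileSync(file, 'utf8'));
    for (const [table, rows] of Object.entries(tables)) {
        await db.truncate(table);     // reset state between tests
        await db.insert(table, rows); // hypothetical insert helper
    }
}

fixture('Users page')
    .page('http://localhost:3000/users')
    .beforeEach(async () => {
        await loadFixtures('fixtures/users.yml');
    });

test('shows seeded users', async t => {
    await t.expect(Selector('li.user').count).eql(3);
});
```

Each test then starts from a known database state, much like Rails fixtures.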
I am using QUnit, which I am new to, for testing my internal JavaScript functions. I want to use it also for some user interactions, like testing error messages after jQuery validations. Since I don't use HTML pages (I would have to test them on GSP, which requires a working runtime), I don't know how to test them. Is it possible to test them against a given host name (like Selenium tests)?
Is it really necessary to test your interactions against a real host? You could always use mocks and test only whether your error messages are being called properly, without actually contacting any external elements. That's the whole point of unit testing, after all. I personally use SinonJS for that kind of thing. You could also check out other mocking frameworks if you like (e.g. JsMockito).
For more information about mocks and stubs, I'd recommend reading this article and doing a little research on Google's testing blog.
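As a rough sketch of that approach with QUnit and Sinon (`validateForm` and the error message text are hypothetical names standing in for your own validation code):

```js
// The spy stands in for the UI call that displays the error message,
// so no HTML page or server is needed.
QUnit.test('shows an error message when the email is invalid', assert => {
    const showError = sinon.spy();

    // Run the validation logic with the spy injected in place of the
    // real error-display function (hypothetical signature).
    validateForm({ email: 'not-an-email' }, showError);

    assert.ok(showError.calledOnce, 'error handler was called');
    assert.ok(showError.calledWith('Invalid email address'),
              'called with the expected message');
});
```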
I have a Visual Studio solution with two projects in it.
The first project is a console application written in F#. The console application is actually a simple server with a web API implementation.
The second project is a web application written in JavaScript. This project contains the corresponding client-side implementation of the API, which allows users to make API requests to the server.
I need to test this client-side implementation of API to be sure that it passes the right data to the server and receives the right data back. What is the best practice to achieve this?
The problem is that I need to build and run the first project before I can run the tests in the second project. It would also be good to have these tests running on Continuous Integration. If it matters, I use visualstudio.com for CI.
Is this possible at all without manually starting the first project and then running the tests?
Note that it is important to restart the console application on each test run.
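One possible approach, sketched here with a hypothetical executable path and a naive readiness check, would be to spawn the console application from the test runner's hooks and kill it in teardown, so every run gets a fresh server:

```js
// Mocha root-level hooks: start the compiled F# server before the
// suite and kill it afterwards. Path and startup detection are
// placeholders for your own project layout.
const { spawn } = require('child_process');

let server;

before(function (done) {
    this.timeout(10000); // allow the server a few seconds to boot
    server = spawn('Server/bin/Release/Server.exe'); // hypothetical path
    // Naive readiness check: resolve once the server prints anything.
    server.stdout.once('data', () => done());
});

after(function () {
    server.kill(); // a fresh process on the next run
});
```

The CI build can then be a single step that builds both projects and invokes the test runner.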
Yes, it is possible. I would recommend having a look at RAML, which is a way of:
Documenting your API
Using the documentation to generate a client and using it for automated tests
Using Anypoint you can expose your documentation and get your stakeholders to easily test it as well.
Please have a look at Design, test and document RESTful APIs using RAML in .NET for a full run-through and explanation. In this blog post I used .NET as my language of choice, but the same principle applies to other languages.
With this you can test your API before you write the client. The Anypoint platform also allows you to mock the web service, and you can use the mock for testing your client later on.
This way you can test both applications in isolation and in a programmatic/automated manner (which you can execute from your CI). RAML also has a library for generating a JavaScript client from a RAML file: https://github.com/mulesoft/raml-client-generator
I am currently looking into alternative platforms to migrate an existing application onto. It started out as a prototype using ASP.NET MVC, but the majority of the code is JavaScript with a simple ASP.NET MVC web service. As we look at taking it forward, it seems sensible to scrap the current Microsoft stack and go for Node.js, giving us more freedom over where and how we host our application. We could also reuse some of the models and code in both the web service and the front end, although this would probably end up being a small amount.
This will probably be quite a large question encompassing many parts, but I will put it out there anyway, as I am sure it could be helpful to lots of other people looking at how to move from something like .NET/Java to Node.js. Most of these statically typed languages lean on patterns and practices such as Inversion of Control, Unit of Work, and Aspect Oriented Programming, so it seems a bit strange to move to a platform which doesn't seem to require as much structure in this area. I therefore have some concerns about migrating from my super-structured and tested world to this new, seemingly unstructured and dynamic world.
So here are the main things I would do in MVC and would want to do in Node.js, but where I am not quite sure of the best way to achieve the same level of separation or functionality.
Routing to Actions
This mechanism in ASP.NET MVC seems to be replaceable by Express in Node.js, which will give me the same ability to map a route to a method. However, there are a couple of concerns:
In ASP.NET MVC my controllers can be dependency-injected and hold instance variables, so actions are easy to test: everything they depend on can be mocked when needed and passed in via the constructor. However, as a routed method in Express seems to have no containing scope, it seems I would have to either use global variables or create the dependencies internally. Is there a nice way to access my business logic containers in these routed methods?
Is there any way to auto-bind the model being sent, or at least get the JSON/XML in some useful form? It appears that if you send the correct MIME type with the content it can be extracted, but I have not seen any clear examples of this online. I am happy to use additional frameworks on top of Express to provide this functionality; ideally I just want to make a POST request to /user/1 and have the user object pulled out so I can update the user with id 1 in the database (which we will talk about shortly).
What is the best way to validate the data being sent over? Currently, in our front-end JavaScript application we use KnockoutJS; all models use Knockout observables and are validated using Knockout.Validation. I am happy to use Knockout Validation on Node, as the models are the contract between the front end and back end, but if there is a better solution I am happy to look into it.
Database Interactions
Currently in .NET land we use NHibernate for communicating with our relational databases and the MongoDB driver for communicating with our MongoDB databases. We use a generic repository pattern and isolate queries into their own classes. We also use a Unit of Work pattern quite heavily, so we can wrap logical chunks of work in a transaction and then either commit it all if it goes well or roll back if it doesn't. This gives us the ability to mock out our objects at almost any level, depending on what we want to test, and also lets us change our implementations easily. So here is my concern:
Sequelize seems to be a good fit for replacing NHibernate; however, it doesn't seem to have any sort of transaction handling, making it very difficult to build a unit of work pattern. This is not the end of the world if it cannot be done in the same way, but I would like some way of grouping a chunk of work: an action like CreateNewUserUnitOfWork would take a model representing the user's details, validate it, create an entry in one table, create some relational data in others, get the user's id from the database, and then send that back (assuming all went well). From looking at the QueryChainer it seems to provide most of the functionality, but if it failed on the 3rd of 5 actions it doesn't appear simple to roll back. Is there some way to get this level of control?
Plugins / Scattered Configuration Data
This is more of a niche concern of our application, but we have the central application and then other DLLs which contain plugins. These are dropped into the bin folder and are then hooked into the routing, database, and validation configuration. Imagine having the Google homepage, where Google Maps, Docs, etc. were all plugins which told the main application to route additional calls to methods within the plugin, each with its own models and database configuration. Here is my concern around this:
There seems to be a way to update the routing by just scanning a directory for new plugins and including them (see "node.js require all files in a folder?"), but is there some best practice around doing this sort of thing? I don't want each request to have to do constant directory scans. It is safe to assume that the plugins are in their right places when the Node application starts, so there is no need to add plugins at runtime at the moment.
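For what it's worth, a minimal sketch of a one-time scan at startup might look like this (the directory layout and the `register(app)` convention are hypothetical):

```js
// Scan the plugins directory once at boot and let each plugin register
// its own routes - no per-request directory reads.
const fs = require('fs');
const path = require('path');
const express = require('express');

const app = express();
const pluginDir = path.join(__dirname, 'plugins');

fs.readdirSync(pluginDir)
  .filter(f => f.endsWith('.js'))
  .forEach(f => {
      const plugin = require(path.join(pluginDir, f));
      plugin.register(app); // each plugin exposes a register(app) hook
  });

app.listen(3000);
```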
Testing
Currently there is unit, integration, and acceptance testing within the application. The unit tests run on both the front end and back end; our build script currently runs the JavaScript tests using JsTestDriver to confirm that all the business logic works as intended in isolation. Then we have integration tests, currently all written in C#, which test that our controllers, actions, and units of work behave as expected; these again are kicked off by the build script but can also be run via the ReSharper unit test runner. Finally, we have acceptance tests, written in C# using WebDriver, which target the front end and test all the functionality via the browser. My main concerns around this are:
What are the best practices and frameworks for testing Node.js? Currently, most of the tests at the ASP.NET layer are carried out via C# scripts that create the controller with mocked dependencies and run its actions to prove it works as intended, using MVC helpers etc. I was wondering what level of support Node.js has for this, as it doesn't seem simple (at a quick glance) to test Node.js components without having Node running.
Assuming there is a good way to test Node.js, can this be hooked into build scripts via command-line runners etc.? We will want everything automated and isolated wherever possible.
I appreciate that this is more like five smaller questions rolled up into one bigger question, but as the underlying question is how to achieve good design in a large Node.js application, I hope this will still be useful to a lot of people who, like myself, are coming from larger enterprise applications to Node.js-style apps.
I recently switched from ASP.NET MVC to Node.js and highly recommend switching.
I can't give you all the information you're looking for, but I highly recommend Sequelize as an ORM and Mocha (with expect.js and Sinon) for your tests. Sequelize is adding transaction support in the next version, 1.7.0. I don't like QueryChainer, since each of its elements is executed separately.
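As a rough sketch of the unit-of-work scenario from the question, using Sequelize's transaction API (shown here in its modern promise-based form; the model names are hypothetical):

```js
const { sequelize, User, Profile } = require('./models'); // hypothetical models

async function createNewUser(details) {
    // Everything inside the callback runs in one transaction: if any
    // step throws, Sequelize rolls the whole thing back.
    return sequelize.transaction(async (t) => {
        const user = await User.create(
            { name: details.name, email: details.email },
            { transaction: t }
        );
        await Profile.create(
            { userId: user.id, bio: details.bio },
            { transaction: t }
        );
        return user.id; // returned to the caller on commit
    });
}
```

That is essentially the CreateNewUserUnitOfWork behaviour described above: all-or-nothing, with the new id sent back on success.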
I got frustrated with Jasmine as a test framework, which is missing an afterAll method for shutting down the Express server after acceptance tests. Mocha is developed by the author of Express.
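Mocha's before/after hooks cover that case. A minimal sketch (the app path and port are hypothetical):

```js
const http = require('http');
const expect = require('expect.js');
const app = require('../app'); // hypothetical path to your Express app

describe('GET /users', function () {
    let server;

    before(function (done) {
        server = app.listen(3001, done); // start once for the suite
    });

    after(function (done) {
        server.close(done); // clean shutdown after all tests
    });

    it('responds', function (done) {
        http.get('http://localhost:3001/users', res => {
            expect(res.statusCode).to.be(200);
            done();
        });
    });
});
```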
If you want to have an Express server load in plugins, you could use a singleton pattern like this: https://github.com/JeyDotC/articles/blob/master/EXPRESS%20WITH%20SEQUELIZE.md
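If you'd rather avoid a singleton, another common option is to build route handlers from a factory that closes over their dependencies, so tests can hand in mocks instead of touching globals. A small sketch with a hypothetical `userRepository`:

```js
// Handlers receive their dependencies via closure rather than globals.
function makeUserRoutes(userRepository) {
    return {
        show: async (req, res) => {
            const user = await userRepository.findById(req.params.id);
            if (!user) return res.status(404).end();
            res.json(user);
        }
    };
}

// Production wiring:
//   const routes = makeUserRoutes(realUserRepository);
//   app.get('/user/:id', routes.show);

// In a unit test, inject a stub instead of a real repository:
//   const routes = makeUserRoutes({ findById: async () => ({ id: 1 }) });
```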
You will probably also really like https://github.com/visionmedia/express-resource, which provides a RESTful interface for accessing your models. For validation, I think you'll be happy with https://github.com/ctavan/express-validator
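On the model-binding concern: with Express's JSON body-parsing middleware enabled (built in on recent versions; older ones used the bodyParser middleware), the posted object simply arrives as req.body. A minimal sketch, where `users.update` is a hypothetical data-layer helper:

```js
const express = require('express');
const app = express();

app.use(express.json()); // parse JSON request bodies by MIME type

app.post('/user/:id', async (req, res) => {
    // req.body is the deserialized user object sent by the client
    const updated = await users.update(req.params.id, req.body); // hypothetical
    res.json(updated);
});
```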
Why would you need to test Node modules without using Node? It's considered standard to call a test script with a Makefile, and you can add a pre-commit hook to git to run the tests before making a commit.
I want to do a smoke test in order to test the connection between my web app and the server itself. Does someone know how to do this? In addition, I want to do acceptance tests to test my whole application. Which tool do you recommend?
My technology stack is Backbone, Require.js and jQuery Mobile, with Jasmine for BDD tests.
When doing BDD you should always mock the collaborators. The tests should run quickly and not depend on any external resources such as servers, APIs, databases etc.
The way you would do this in, for example, Jasmine is to declare a spy that pretends to be the server. You then define what the spy's response would be in a particular scenario or example.
This is the best approach if you want your application to be environment-independent, which is very much needed when running Jenkins jobs: building a whole infrastructure around the job would be hard to reproduce.
Make spy/mock objects that represent the server, and in your specs define how the external sources behave. This way you can focus on the behavior your application delivers under specified circumstances.
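A minimal sketch with a Jasmine spy standing in for the server (the UserView and its internals are hypothetical, assuming a Backbone-style view that fetches via $.ajax):

```js
describe('user loading', function () {
    it('renders the name returned by the server', function () {
        // The spy replaces $.ajax, so no real server is contacted.
        spyOn($, 'ajax').and.callFake(function (options) {
            // Pretend the server answered immediately.
            options.success({ id: 1, name: 'Ada' });
        });

        var view = new UserView(); // hypothetical view using $.ajax
        view.load(1);

        expect($.ajax).toHaveBeenCalled();
        expect(view.model.get('name')).toEqual('Ada');
    });
});
```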
This isn't a complete answer, but one tool we've been using for our very similar stack is mockJSON. It's a jQuery plugin that does a nice job of both:
intercepting calls to a URL and instead sending back mock data and
making it easy to generate random mock data based on templates.
The best part is that it's entirely client-side, so you don't need to set up anything external to get decent tests. It won't test the actual network connection to your server, but it can do a very good job of validating the type of data your server would be kicking back. FWIW, we use Mocha as our test framework and haven't had any trouble integrating this with our BDD work.
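For a flavor of it, here is a short sketch in the style of the plugin's README (the URL pattern and template fields are illustrative):

```js
// Intercept calls matching the URL pattern and answer with randomly
// generated data built from a template.
$.mockJSON(/\/api\/users/, {
    'users|3-5': [{          // generate 3 to 5 user objects
        'id|+1': 1,          // auto-incrementing id
        'name': '@MALE_FIRST_NAME @LAST_NAME'
    }]
});

// Any $.getJSON('/api/users', ...) call in the code under test now
// receives the generated payload instead of hitting the network.
```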
The original mockJSON repo is still pretty good, though it hasn't been updated in a little while. My colleagues and I have been trying to keep it going with patches and features in my own fork.
I found a blog post where the author explains how to use Capybara, Cucumber and Selenium outside a Rails application, so they can be used to test a JavaScript app. Here is the link: http://testerstories.com/?p=48
We've just started developing a web app with AngularJS and we're having some problems with testing it properly so we could use some advice.
Generally, there are the following components to test:
Web API
Angular Controllers
Angular routing
HTML rendering and Angular binding of Controllers to the HTML elements
How does one test all of this with minimal effort and no, if possible, overlap?
For any database-centric application, complete integration testing (i.e. with a live server connected to a database loaded with data) would be particularly messy, because there would have to be a process that generates sufficient data for all tests and resets the DB, and tests would have to be careful not to modify each other's data (if I'm missing something here, please let me know).
Given the above point, I'm assuming it is best to sever the link between server and client and run the Angular tests using mock data only.
Also, I'm assuming that if E2E testing takes care of all possible scenarios, unit testing controllers is redundant as their values are bound to the model (and would thus be testing all of 2, 3 and 4 above). Unit testing would only be helpful in very complex controllers or to test services and directives.
However, we could not find any information on how to mock stuff with $httpBackend on a per-test basis like you would do in unit tests, using expect*(). Angular docs seem to suggest using when*() plus the occasional passthrough() when necessary.
But this poses the aforementioned problem of creating test data for all scenarios, and you'd probably need to reset the in-memory DB before each test to be sure the tests are not affected. Also, you lose the safety of $httpBackend.expect*(), which checks that there are no missing or redundant calls to the server. This suggests to me that it would also require unit testing controllers to check this.
Can someone provide a detailed testing strategy for AngularJS apps that addresses the testing of the 4 components above as well as the concerns written above?
Not sure - since your angular app is theoretically decoupled from your backend, there's no particular reason that angular tests and backend tests need to be commingled. I would test them separately, with each test suite assuming that the other component works fine. (So when testing angular, you assume that the server will work as expected.)
Unit tests - they provide more depth than E2E tests. It's easier to verify specific conditions the code will face. It's also easy to mock out all dependencies as necessary and test just the component the unit test is interested in. Unit tests don't concern themselves with how the UI works, or whether the right data is bound correctly, but rather with whether the business logic of the app is correct.
(and 4) E2E tests - less granularity, focusing on making sure the UI looks as expected from an end user perspective. You are right that it's messy to test against a live database (although some people enjoy the safety provided by a full end-to-end integration test), and so you should use $httpBackend to mock out the server.
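On the per-test expect*() question above, this works in unit tests much as the when*() examples in the docs do. A minimal sketch (the module, controller, and URL are hypothetical):

```js
describe('UserListCtrl', function () {
    var $httpBackend, scope;

    beforeEach(module('myApp')); // hypothetical module name

    beforeEach(inject(function (_$httpBackend_, $rootScope) {
        $httpBackend = _$httpBackend_;
        scope = $rootScope.$new();
    }));

    afterEach(function () {
        // Fails the test if expected calls are missing or extra calls remain.
        $httpBackend.verifyNoOutstandingExpectation();
        $httpBackend.verifyNoOutstandingRequest();
    });

    it('loads users from the API', inject(function ($controller) {
        // Set the expectation before the controller fires the request.
        $httpBackend.expectGET('/api/users')
                    .respond([{ id: 1, name: 'Ada' }]);

        $controller('UserListCtrl', { $scope: scope }); // hypothetical controller
        $httpBackend.flush(); // deliver the queued response

        expect(scope.users.length).toBe(1);
    }));
});
```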
Refer to this: https://www.sitepoint.com/unit-and-e2e-testing-in-angularjs/.
If your service/factory uses the $http service to call a remote API, you can return fake data from it for unit testing. If you are starting a new Angular project, consider using Protractor for E2E tests.