Front-end JavaScript testing using RequireJS and ReSharper - javascript

So I've been trying to figure out how front-end (unit) testing works, but I am getting stuck at one point.
I have my Jasmine test set up as follows:
describe('Blabla', function () {
    it('returns true', function () {
        var people = require(["people"], function(ppl) {
            return ppl;
        });
        expect(people.getTitle()).toBe('People piolmjage');
    });
});
But running this gets me:
TypeError: undefined is not a function
So obviously, people is undefined. Perhaps my callback comes in too late. But if I remove the callback, I get the following error:
it('returns true', function () {
    var people = require("people");
    expect(people.getTitle()).toBe('People piolmjage');
});
Error: Module name "people" has not been loaded yet for context: _. Use require([])
I figure there is something wrong in my setup... Does anyone have any idea how to get this front-end testing to work?
I did manage to get it to work from the console, using define combined with PhantomJS and the Durandal test files, but I need this to work outside of the console, and then I cannot use define because the test runner won't find my tests.
That's why I need to use the CommonJS way of getting the required viewmodels.
My people model:
define([],
    function () {
        var getTitle = function() {
            return "hello";
        };
        var peopleViewModel = {
            title: 'People page',
            getTitle: getTitle
        };
        return peopleViewModel;
    });
UPDATE
I got the code working, but not with ReSharper, by following this page from the Durandal website.
But this gets me console output which is way too unstructured to actually read through.
I can, however, use the define keyword and then it works fine. So I assume it is the require keyword where I mess something up?
UPDATE 2
So I used Fiddler to check what is going on. I also finally got it working (kind of...).
My test file looks like this now:
///<reference path="../../Scripts/require.js"/>
///<reference path="../../test/lib/jasmine-2.1.3/jasmine.js"/>
///<reference path="../../App/viewmodels/people.js"/>
describe('Blabla', function () {
    it('require test', function (done) {
        require(['people'], function (people) {
            expect(people.title).toBe('People page');
            done();
        });
    });
});
And then I changed my people file:
define("people", ["bla"], function (bla) {
return {
title: 'People page',
bla: bla
};
});
As you can see here, I explicitly name my viewmodel "people".
This works for the test runner, but it doesn't actually fetch any files through RequireJS, only the reference paths. Also, this does not fit my needs because the Durandal models are unnamed.
Fiddler screenshot:
So basically it does not use RequireJS to fetch the viewmodels, and therefore I cannot just use the require.config initializer to point at my viewmodels folder and download every viewmodel through RequireJS. Any thoughts?

I finally got it working; it took me about a day and a half.
Anyway, I don't use ReSharper anymore, or its test runner to be more precise.
Chutzpah is the one I turned to in the end. This too took some research, but I got it to the point where it includes everything the way I want it to.
Check this post for sure
Here is what I did:
My people.js looks like this:
define(['viewmodels/bla'], function (bla) {
    return {
        title: 'People page',
        bla: bla // testing dependencies on other viewmodels
    };
});
Then I also made a bla.js
define(function() {
    return {
        bla: "bla"
    };
});
And now for the tests:
describe('Blabla', function () {
    it('require test', function (done) {
        require(['viewmodels/people'], function (people) {
            expect(people.title).toBe('People page');
            done();
        });
    });
    it('dependency on require test', function (done) {
        require(['viewmodels/people'], function (people) {
            console.log(people.bla);
            expect(people.bla.bla).toBe('bla');
            done();
        });
    });
});
And then eventually, reading the answers on the link provided at the top, I had to create a Chutzpah config file to create a test harness:
{
    "Framework": "jasmine",
    "TestHarnessReferenceMode": "AMD",
    "TestHarnessLocationMode": "SettingsFileAdjacent",
    "References": [
        { "Path": "../Scripts/require.js" },
        { "Path": "requireConfig.js" }
    ],
    "Tests": [
        { "Path": "specs" }
    ]
}
Now, running the tests with the Visual Studio test runner actually gets me everything I need, and as you can see, I can now access all my viewmodels through require like so: require(['viewmodels/whateverviewmodel'], function(whateverviewmodel){....})
I hope this answer can get people on their way to testing their (Durandal) SPA using Jasmine and RequireJS.
I know the viewmodels in this answer, and in the question itself, don't say much, but this should give you an idea of how to go about all of this.
Small Edit
You can now also skip the callback mess with require([]...) inside the tests and build your tests the way you build your viewmodels, with define:
define(['viewmodels/people'], function (people) {
    describe('Blabla', function () {
        it('require test', function () {
            expect(people.title).toBe('People page');
        });
        it('dependency on require test', function () {
            console.log(people.bla);
            expect(people.bla.bla).toBe('bla');
        });
    });
});
This gives you fewer levels of indentation and is more readable in itself.

The require call provided by RequireJS is inherently asynchronous so you need to do something like this:
it('returns true', function (done) {
    require(["people"], function(people) {
        expect(people.getTitle()).toBe('People piolmjage');
        done(); // Signal that the test is done.
    });
});
The first attempt you show in your question cannot work. That's the classic "trying to return values synchronously from asynchronous code" mistake. The second attempt with require("people") does not work either, because this require call is pseudo-synchronous and will work only if the requested module is already loaded. See this answer for an explanation of how this pseudo-synchronous require works.
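If you prefer a synchronous-looking require inside a spec, RequireJS's CommonJS sugar form is worth a sketch (assuming "people" resolves through your configuration): the loader scans the factory for require() calls and loads those modules before running it.

define(function (require) {
    // RequireJS parses this factory, sees require("people"), and loads that
    // module up front, so the call below can return it synchronously.
    var people = require("people");

    describe('Blabla', function () {
        it('returns true', function () {
            expect(people.getTitle()).toBe('People piolmjage');
        });
    });
});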

Related

"ReferenceError: Can't find variable: Mustache" in Unit Tests?

I'm running a Rails app and am running unit tests on the JavaScript side (using Teaspoon/Jasmine).
The funny thing is, I KNOW the Mustache.render function is working in the function I call (I'm able to console.log its return value and see that it is correct). However, when I call that function from my unit tests I'm getting:
Failure/Error: ReferenceError: Can't find variable: Mustache.
For reference, I don't actually call Mustache.render directly; I'm simply calling the function that uses it and grabbing its return value to check against.
I've been able to successfully grab and use various other functions just fine; this one is just giving me trouble. Can the Mustache.render object not exist outside its own file or scope or something?
Edit: Example code:
_makeSomething: function viewMakeSomething(data) {
    const template = templates.something;
    return Mustache.render(document.querySelector(template).innerHTML, data);
}
and my test code is simply:
it('_makeSomething object', function() {
    let obj = {id: 1234, content: "_makeSomething Assertion", author: "Test User"};
    let $something = _makeSomething(obj);
});
(Right now I'm just capturing the result before I assert anything or split it up, etc.; the problem is just calling it at all.)
The problem is that Teaspoon doesn't have access to your dev/production asset pipeline. You should specify what JS to load for your tests. This is necessary to prevent loading all files from the manifest just to test one feature, because this is unit testing.
For example:
//= require mustache
describe("My great feature", function() {
    it("will change the world", function() {
        expect(true).toBe(true);
        expect(Mustache).toBeDefined();
    });
});
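Applied to the original example, the spec might then look roughly like this. The //= require of the file defining _makeSomething and the myView name it hangs off are assumptions, and a matching template would also have to exist in the test DOM (e.g. via a fixture):

//= require mustache
//= require my_view   // hypothetical file that defines the object holding _makeSomething

describe("_makeSomething", function() {
    it("renders the template with the given data", function() {
        var obj = { id: 1234, content: "_makeSomething Assertion", author: "Test User" };
        expect(Mustache).toBeDefined();                    // Mustache is now loaded
        expect(myView._makeSomething(obj)).toBeDefined();  // myView is an illustrative name
    });
});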

getText is not a function error - Protractor (javascript)

I have Node.js and Protractor installed. I have experience with selenium-webdriver, but Protractor is driving me nuts!!! I am also not that familiar with JavaScript.
This is what my code looks like:
describe('My app', function() {
    var result = element(by.id('result-name'));
    var enterBtn = element(by.id('enter'));
    var clearFieldBtn = element(by.id('clear-field'));
    it('should bring up components on load', function() {
        browser.get(`http://localhost:${process.env.PORT}`);
        browser.wait(until.titleContains('Sample App'), 500);
        browser.wait(until.presenceOf(browser.element(by.id('my-test-app'))), 500);
        expect(enterBtn).isPresent;
    });
    it('result should equal username', function () {
        browser.get(`http://localhost:${process.env.PORT}`);
        expect(clearFieldBtn).isPresent;
        expect(result.getText()).toEqual('John Smith'); // both tests pass without this line of code
    });
});
The last line "expect(result.getText()).toEqual('John Smith');" throws me an error. I get:
expect(...).toEqual is not a function
Any help would be much appreciated. I have spent a couple of hours trying to find a solution and trying different things.
I also wanted to use the isPresent function the way it's done in the API docs, which is like this: expect($('.item').isPresent()).toBeTruthy();
I tried to do:
expect(clearFieldBtn).isPresent().toBeTruthy();
But I get that isPresent is not a function...
The expect above that line seems poor. It should read
expect(clearFieldBtn.isPresent()).toBeTruthy();
Not sure if that is causing the weird error on the line below... just thought I would throw it out there. All your Protractor API calls need to be made within the expect, because isPresent is not an attribute of expect.
Have you tried these lines:
clearFieldBtn.isPresent().then(function(bln) {
    expect(bln).toBe(true);
});
result.getText().then(function(tmpText) {
    expect(tmpText).toBe('John Smith');
});
If you still get an error on result.getText(), please check the presence of the result object.
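For the presence check, one way to make it explicit before reading the text (building on the until helper already used in the question) is roughly:

// Wait for the element to exist before asking for its text.
browser.wait(until.presenceOf(result), 5000);
result.getText().then(function (text) {
    expect(text).toEqual('John Smith');
});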

How to implement a plugin that modifies the original module only when required?

I have a plugin extending an original module.
It should only modify the module when explicitly required.
Problem:
As soon as it is required once, the original module is modified forever, also for cases where the plugin is not a dependency.
The order doesn't matter here, it's enough to require the plugin once.
Example:
define("main", [], function() {
return {opt: "A"};
});
define("plugin", ["main"], function(obj) {
obj.opt = "B";
});
require(["main", "plugin"], function(obj) {
console.log(obj.opt); // should log B
});
require(["main"], function(obj) {
console.log(obj.opt); // should log A but logs B
});
I guess the way to go is to somehow tell require to always reload main from source instead of using the cached version.
I have no idea how, though.
Or maybe there's an even more elegant way?
Please enlighten me, guys.
Fiddle: http://jsfiddle.net/r75e446f
UPDATE: Some might find it important to know that I need this for my karma unit test environment to test a module with and without the plugin.
UPDATE2: Look below for my own solution.
RequireJS modules are singletons. If you load main once, twice, 10 times, you are always going to get the same module. And so if you modify its state, it is modified for all modules that use it. It is possible to tell RequireJS to undefine the module but I do not recommend it as it will just make your code obscure.
If I wanted to do what you are trying to do I'd design my code something like this:
define("main", [], function() {
function Main (opt) {
this.opt = opt;
}
return Main;
});
define("plugin1", [], function() {
return {
install: function (main) {
main.opt += " plugin1";
}
};
});
define("plugin2", [], function() {
return {
install: function (main) {
main.opt += " plugin2";
}
};
});
// Case 1: no plugins
require(["main"], function(Main) {
var main = new Main("A");
console.log(main.opt);
});
// Case 2: only plugin1
require(["plugin1", "main"], function (plugin1, Main) {
var main = new Main("A");
plugin1.install(main);
console.log(main.opt);
});
// Case 3: only plugin2
require(["plugin2", "main"], function (plugin2, Main) {
var main = new Main("A");
plugin2.install(main);
console.log(main.opt);
});
// Case 4: plugin1 and plugin2
require(["plugin1", "plugin2", "main"], function (plugin1, plugin2,
Main) {
var main = new Main("A");
plugin1.install(main);
plugin2.install(main);
console.log(main.opt);
});
Basically, make what is common to all cases a Main class which can be initialized at construction and which can be modified by plugins. Then each plugin can install itself on Main. The code above is a minimal illustration of how it could be done. In a real project, the final solution would have to be designed to take into account the specific needs of the project.
If you don't want the original module to be modified for all modules which use it, then your plugin should not modify the original module. Instead, have the plugin return a modified copy of the original module.
define("main", [], function() {
return {opt: "A"};
});
define("plugin", ["main"], function(obj) {
var decorator = {}
for (var key in obj) { decorator[key] = obj[key];}
decorator.opt = "B";
return decorator
});
require(["main", "plugin"], function(obj, plugin) {
console.log(plugin.opt); // should log B
});
require(["main"], function(obj) {
console.log(obj.opt); // should log A but logs B
});
This will work without any complications if your original object is a simple struct-like object without any functions. If there are functions, or if your original object was constructed using the Module Pattern, then there is a strong possibility of subtle errors in the copy, depending on how the methods were defined.
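As a made-up illustration of that pitfall: if the original module keeps private state in a closure, a shallow copy's methods still point at that same closure, so the "copies" are not really independent.

define("counter", [], function () {
    var count = 0; // private state captured by the closure
    return {
        increment: function () { return ++count; },
        value: function () { return count; }
    };
});

define("counterCopy", ["counter"], function (counter) {
    var copy = {};
    for (var key in counter) { copy[key] = counter[key]; }
    return copy; // copy.increment() still mutates the original's count
});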
EDIT 2015-01-13: The OP clarified that he would like a way for his tests to run both the modified and unmodified original module without having to reload the page. In that case, I would recommend using require.undef to unload the main module and then reload it, without reloading the entire page.
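A rough sketch of that approach (assuming main and plugin live in their own files, so they can be re-fetched after being undefined):

// First spec: run with the plugin applied.
require(["main", "plugin"], function (obj) {
    console.log(obj.opt); // B

    // Forget the cached modules so the next require re-evaluates them.
    require.undef("plugin");
    require.undef("main");

    // Second spec: main is loaded fresh, without the plugin.
    require(["main"], function (freshObj) {
        console.log(freshObj.opt); // A
    });
});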
I would like to bring up my understanding for discussion: when we use AMD, we define the module. Unless the module (in this case main, the defined source) is written to accept the plugin and allow the plugin to change its properties, I guess we're not able to hijack it, or it would be an anti-pattern?
What about extending/cloning the defined source, like this:
define("plugin", ["main"], function(obj) {
return $.extend(true, {}, obj, {opt: "B"});
});
http://jsfiddle.net/r75e446f/6/
Here the plugin module always uses the main module as its dependency; then, when using the module, we just use the plugin module directly and no longer need to require the main module. (By the way, I understand this defeats the definition of 'plugin', but my direction is to deal with the module the way RequireJS is designed to work.)
--2nd approach--
If you really want to do it the plugin way, how about a loader plugin? Like the text plugin we always use: require('text!abc.json'). So write a loader, register it in the RequireJS config, and then we can use it. (more)
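A rough sketch of that loader-plugin idea (the plugin name and the modification it makes are illustrative only):

// Usage would be: require(["withPlugin!main"], function (modifiedMain) { ... });
define("withPlugin", [], function () {
    return {
        load: function (name, parentRequire, onload, config) {
            parentRequire([name], function (original) {
                // Hand back a modified copy so the cached original stays untouched.
                var modified = {};
                for (var key in original) { modified[key] = original[key]; }
                modified.opt = "B";
                onload(modified);
            });
        }
    };
});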
So I found out how to achieve what I want and thought I'd share it here.
The answer is called 'context', which is an option in the RequireJS config.
http://requirejs.org/docs/api.html#multiversion
Here's how I implemented my solution:
var reqOne = require.config({
    context: 'one'
});
var reqTwo = require.config({
    context: 'two'
});
reqOne(["main", "plugin"], function(obj) {
    console.log(obj.opt); // logs B
});
reqTwo(["main"], function(obj) {
    console.log(obj.opt); // logs A
});
Unfortunately this doesn't work in the fiddle, because the second require will try to load the main module externally, and I can't upload files to jsfiddle.
But the mere fact that it tries to do that, and doesn't use the 'main' module already loaded in the other require, should be proof that this method works.
If the definitions are outsourced to single files, this works like a charm.
Thanks to everybody who chimed in.
A more elegant solution would be to take an object-oriented approach, where:
the modules return constructors rather than instances, and
plugin can then be a subclass of main.
This is the implementation (fiddle):
console.clear();
define("Main", [], function() {
    return function() { this.opt = "A"; };
});
define("Plugin", ["Main"], function(Main) {
    return function() {
        Main.call(this);
        this.opt = "B";
    };
});
require(["Plugin"], function(Plugin) {
    console.log((new Plugin()).opt); // should log B
});
require(["Main"], function(Main) {
    console.log((new Main()).opt); // should log A
});

Make real async requests for jasmine integration testing

I've got an Angular app with some Jasmine testing. I recently added a new method to one of my services that makes a query to Elasticsearch; it looks like this:
test:
    function(){
        return "Working";
    },
executeSearch:
    function(field, value, size, page_number){
        return service.executeRegExSearch(field, value, size, page_number);
    },
// new method
executeRegExSearch:
    function(field, value, size, page_number){
        // main search body; I know it works because
        // I am getting expected results in the browser
    }
And then in my Jasmine tests, I've got something like this:
//initialization stuff
var $httpBackend;
var searchAPI;
beforeEach(inject(function($injector){
    jasmine.DEFAULT_TIMEOUT_INTERVAL = 5000;
    $httpBackend = $injector.get('$httpBackend');
    searchAPI = $injector.get('searchAPI');
}));
it("is loaded properly", function(){
    expect(searchAPI.test() == "Working").toBe(true); // passes
});
it("can make a request", function(done){
    var field = "col_name";
    var value = "bb.*"; // this is matching in my browser/application
    var size = 10;
    var page_number = 1;
    var res;
    searchAPI.executeSearch(field, value, size, page_number).then(function(res){
        res = res;
        alert(JSON.stringify(res));
        done();
    });
    $httpBackend.flush();
});
But when I run it, I get the error:
Unexpected request: GET http://myserver/index-1/_search?source={"query":{"regexp":{"col_name":{"value":"bb.*"}}}}"&size=10&from=0 No more request expected in http://localhost:8081/js/angularjs/angular-mocks.js (line 1180)
I'm not sure how to use the mocks, or wasn't even aware that I was using them. All I want to do is run a suite of tests that make actual calls to my backend to confirm that things are integrated properly... you know, integration tests.
Anyone have any advice?
I didn't want to do tests with Selenium stuff, so to test my app with real backend calls I used a modified version of angular-mocks.
It works just like it does for unit tests in Jasmine.
I'm using it with Jasmine 2.0, so a test looks like the following:
it('myTest', function (done) {
    _myService.apiCall()
        .then(function () {
            expect(true).toBeTruthy();
            done();
        });
});
NB: the done is needed because of the async call.
If you want to do end-to-end (e2e) tests for Angular applications, you should definitely use Protractor (it was created by the Angular team); you can create your tests using Jasmine and run them against your application. If you are just doing unit testing, I think the best way is to mock the httpBackend.
https://github.com/angular/protractor
https://docs.angularjs.org/guide/e2e-testing
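If you would rather stay with the stock angular-mocks, another option worth sketching (the 'myApp' module name here is a placeholder, not from the question) is the ngMockE2E flavour of $httpBackend, which can pass requests through to the real server:

beforeEach(module('myApp', 'ngMockE2E')); // 'myApp' stands in for your application module

beforeEach(inject(function ($httpBackend) {
    // Let every GET reach the real backend instead of being intercepted.
    $httpBackend.whenGET(/.*/).passThrough();
}));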

RequireJS - using module returns undefined

I'm trying to use RequireJS in my app. I'm including the requirejs script from cdnjs like this:
<script src="//cdnjs.cloudflare.com/ajax/libs/require.js/2.1.10/require.min.js"></script>
On my page I have a button, and I register an event handler for it:
$('#btnSpeedTest').on('click', function (e) {
    require([baseUrl + 'Content/js/tools/speedtest.js'], function (speedTestModule) {
        alert(speedTestModule);
    });
});
If I watch with Fiddler, I see that upon clicking the button, speedtest.js is loaded.
speedtest.js contains the following:
define('speedTestModule', function () {
    function SpeedTest(settings, startNow) {
        // basic initialization
    }
    var fn = SpeedTest.prototype;
    fn.startRequest = function (download, twoRequests) {
        // logic
    };
    return SpeedTest;
});
The alert(speedTestModule); call shows "undefined". I saw a tutorial on RequireJS, and in that tutorial everything was in the same directory, with files named after the modules (which is not my case, since I'm loading RequireJS from a CDN).
I even tried to return a simple string, but it did not work. What am I missing?
Thanks
Don't use a named define. Instead of this:
define('speedTestModule', function () {
do this:
define(function () {
and let RequireJS name your module. You typically want to let r.js add names to your modules when you optimize them. There are a few cases where using names yourself in a define call is warranted but these are really special cases.
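To make that concrete, the module could be rewritten roughly like this and then requested by path, letting the loader derive the module id (paths mirror the question's setup):

// speedtest.js - anonymous module; RequireJS names it after the path it was loaded from.
define(function () {
    function SpeedTest(settings, startNow) {
        // basic initialization
    }
    SpeedTest.prototype.startRequest = function (download, twoRequests) {
        // logic
    };
    return SpeedTest;
});

// Caller stays the same; speedTestModule now receives the SpeedTest constructor.
$('#btnSpeedTest').on('click', function () {
    require([baseUrl + 'Content/js/tools/speedtest.js'], function (speedTestModule) {
        alert(speedTestModule); // no longer undefined
    });
});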
