I'm building the back-end for an app using Node and Express.
I separated the different parts of my code into different files: for example, everything that concerns accessing the database is in DBService.js, and if I want to perform any action related to my users, I have a UserService.js file that does everything the app needs with users and uses DBService.js to save them in the DB.
I'm aware I have some circular dependencies in my code, but everything worked fine until now. I'm using GraphQL for pretty much everything, but I'm adding a normal endpoint to grab a file given its ID.
I require FileService.js in index.js (the entry point of the Node app) to serve the file, and this part works well. The problem is that in another file (ZoneService.js) where I also require FileService.js, it returns an empty object.
I know for a fact that this is the problem because if I remove the require in the index.js file, the problem disappears.
These are the paths that lead to the circular dependencies. The '->' means that the previous Service requires the next.
FileService -> ZoneService -> FileService
FileService -> ZoneService -> FileUploadService -> FileService
It may look silly, but I need this because I thought it was a good move to keep the GraphQL type definitions and resolvers of each entity in its own file.
I will try to explain my reasoning for the first path:
I want to grab the files that belong to a certain zone, so this function goes into FileService. I then use ZoneService to get the file IDs given the zone ID, and then I get the paths from the DB.
ZoneService needs FileService to resolve the 'files' field in the zone entity.
I could just move this function to ZoneService and get the files from there, but that would kind of break my whole logic of separating concerns.
What I would like to know is the best way of fixing this so that it does not happen again, and how it can be avoided.
I would post some code, but I'm not sure what, so if you think it's necessary let me know.
Thanks in advance!
Edit - Here is some code:
FileService.js
//Import services to use in resolvers
const EditService = require("./EditService.js")
const ZoneService = require("./ZoneService.js")
//Resolvers
const resolvers = {
  Query: {
    getFileById: (parent, {_id}) => {
      return getFileById(_id)
    },
    getFilesById: (parent, {ids}) => {
      return getFilesById(ids)
    },
    getFilesByZoneId: (parent, {_id}) => {
      return getFilesByZoneId(_id)
    },
  },
  File: {
    editHistory: file => {
      return EditService.getEditsById(file.editHistory)
    },
    fileName: file => {
      return file.path.split('\\').pop().split('/').pop();
    },
    zone: file => {
      return ZoneService.getZoneById(file.zone)
    }
  }
}
ZoneService.js
//Import services to use in resolvers
const UserService = require("./UserService.js")
const FileService = require("./FileService.js")
const EditService = require("./EditService.js")
const ErrorService = require("./ErrorService.js")
const FileUploadService = require("./FileUploadService.js")
//Resolvers
const resolvers = {
  Query: {
    getZone: (parent, {_id, label}) => {
      return _id ? getZoneById(_id) : getZoneByLabel(label)
    },
    getZones: () => {
      return getZones()
    },
  },
  Zone: {
    author: zone => {
      return UserService.getUserById(zone.author)
    },
    files: zone => {
      if (zone.files && zone.files.length > 0) return FileService.getFilesById(zone.files)
      else return []
    },
    editHistory: zone => {
      return EditService.getEditsById(zone.editHistory)
    }
  },
  Mutation: {
    createZone: async (parent, input, { loggedUser }) => {
      return insertZone(input, loggedUser)
    },
    editZone: async (parent, input, { loggedUser }) => {
      return editZone(input, loggedUser)
    },
    removeZone: async (parent, input, { loggedUser }) => {
      return removeZone(input, loggedUser)
    }
  },
}
A couple of dos and don'ts:
Do split your schema into smaller modules. For most schemas, it makes sense to split the type definitions and resolvers across multiple files, grouping together related types and Query/Mutation fields. The resolvers and type definitions might be exported from a single file, or the type definitions might reside in a file by themselves (maybe a plain text file with a .gql or .graphql extension). (Note: Borrowing Apollo's terminology, I'm going to refer to the related type definitions and resolvers as a module).
Don't introduce dependencies between these modules. Resolvers should operate independently of one another. There's no need to call one resolver inside another -- and certainly no need to call one module's resolver from inside another module. If there's some shared logic between modules, extract it into a separate function and then import it into both modules.
Do keep your API layer separate from your business logic layer. Keep business logic contained to your data model classes, and keep your resolvers out of these classes. For example, your app should have a Zone model, or ZoneService or ZoneRepository that contains methods like getZoneById. This file should not contain any resolvers and should instead be imported by your schema modules.
Do use context for dependency injection. Any data models, services, etc. that your resolvers need access to should be injected using context. This means instead of importing these files directly, you'll utilize the context parameter to access the needed resource instead. This makes testing easier, and enforces a unidirectional flow of dependencies.
So, to sum up the above, your project structure might look something like this:
services/
  zone-service.js
  file-service.js
schema/
  files/
    typeDefs.gql
    resolvers.js
  zones/
    typeDefs.gql
    resolvers.js
And you might initialize your server this way:
const FileService = require(...)
const ZoneService = require(...)
const server = new ApolloServer({
  typeDefs,
  resolvers,
  context: () => ({
    services: {
      FileService,
      ZoneService,
    }
  })
})
Which means your resolver file would not need to import anything, and your resolvers would simply look something like:
module.exports = {
  Query: {
    getFileById: (parent, {_id}, {services: {FileService}}) => {
      return FileService.getFileById(_id)
    },
    getFilesById: (parent, {ids}, {services: {FileService}}) => {
      return FileService.getFilesById(ids)
    },
    getFilesByZoneId: (parent, {_id}, {services: {FileService}}) => {
      return FileService.getFilesByZoneId(_id)
    },
  },
}
Better yet, you should avoid circular dependencies.
A simple way to do this is to split your modules into smaller ones, for example:
FileService -> CommonService
ZoneService -> CommonService
or
FileServicePartDependsOnZoneService -> ZoneService
ZoneService -> FileServicePartNotDependsOnZoneService
or
FileService -> ZoneServicePartNotDependsOnFileService
ZoneServicePartDependsOnFileService -> FileService
Note that these are just examples; you should give your modules names that are meaningful and shorter than in my example.
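To make the first split concrete, here is a minimal sketch (all names hypothetical, shown in one file for brevity; each section would live in its own file): the shared lookup moves into a common module, so neither service has to require the other.

```javascript
// common-service.js: shared DB-lookup logic both services need
function getFilesByIds(ids) {
  // stand-in for the real DB query
  return ids.map(id => ({ _id: id }));
}

// file-service.js: would require only common-service.js
const FileService = {
  getFilesById: ids => getFilesByIds(ids)
};

// zone-service.js: would also require only common-service.js, not FileService
const ZoneService = {
  resolveFiles: zone =>
    zone.files && zone.files.length > 0 ? getFilesByIds(zone.files) : []
};
```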
Another way is to merge them together (but that may be a bad idea).
If you cannot avoid circular dependencies, you can also require a module when you need it instead of importing it at the top.
For example:
//FileService.js
let ZoneService

function doSomeThing() {
  if (!ZoneService) {
    ZoneService = require("./ZoneService.js").default
    //or ZoneService = require("./ZoneService.js")
  }
  //using ZoneService
}
For reusability, define a getZoneService function or something similar.
I'm operating a bot on Wikipedia using npm mwbot, and planning to migrate to npm mwn. This is because you need a "token" to edit pages on Wikipedia, and this can expire after a while, so you need to prepare your own countermeasures against this if you use mwbot, but it seems like mwn handles this issue on its own.
When you use mwn, you first need to initialize a bot instance as documented in the tutorial:
const bot = await mwn.init(myUserInfo);
Then your token is stored in the bot instance and you can for example edit a page using:
const result = await bot.edit('Page title', () => ({text: 'text'}));
So, basically you want to share the initialized bot instance across modules. I believe it'd be easiest to declare a global variable like so:
// bot.js (main script)
const {mwn} = require('mwn');
const {my} = require('./modules/my');
(async() => {
  global.mw = await mwn.init(my.userinfo);
  const {t} = require('./modules/test');
  t();
})();

// modules/test.js
/* global mw */
exports.t = async () => {
  const content = await mw.read('Main page');
  console.log(content);
  return content;
};
I'm currently using JavaScript, but will hopefully migrate to TypeScript (although I'm new to it) because I feel like it'd be useful in developing some of my modules. But I'm stuck with how I should use the initialized bot instance across modules in TypeScript.
.
+-- dist (<= where ts files are compiled into)
+-- src
    +-- types
    |   +-- global.d.ts
    +-- bot.ts
    +-- my.ts
// bot.ts
import {mwn} from 'mwn';
import {my} from './my';
(async() => {
  global.mw = await mwn.init(my.userinfo);
})();
// global.d.ts
import {mwn} from 'mwn';
declare global {
  // eslint-disable-next-line no-var
  var mw: mwn;
}
This doesn't work and returns "Element implicitly has an 'any' type because type 'typeof globalThis' has no index signature. (at mw in global.mw)".
This is probably a naive question but any help would be appreciated.
Edit:
Thanks @CertainPerformance, that's a simple and easy approach. Actually, I once tried the same kind of approach:
export const init = async () => {
  if (typeof mw !== 'undefined') {
    return Promise.resolve(mw);
  } else {
    return mwn.init(my.userinfo).then((res) => {
      mw = res;
      return mw;
    });
  }
};
But I was like "init().then() in every module?"... I don't know why I didn't come up with just exporting the initialized mwn instance.
Anyway, is it that the entry point file should be a .js file? I've been trying with a .ts file and this is one thing that's been giving me a headache. I'm using ts-node or nodemon to auto-compile .ts files, but without "type": "module" a "Cannot use import statement outside a module" error occurs, and with it included, "TypeError [ERR_UNKNOWN_FILE_EXTENSION]: Unknown file extension ".ts"" occurs. How do you tell whether a given file should be a .js or .ts file?
Edit2:
Just making a note: The error I mentioned above was caused by not having "module": "CommonJS" in my tsconfig.json, as I commented to CertainPerformance's answer below.
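For reference, the relevant part of tsconfig.json would look something like this (all other options elided):

```json
{
  "compilerOptions": {
    "module": "CommonJS"
  }
}
```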
One of the main benefits of modules is the ability to drop dependencies on global variables. Rather than going back on that and assigning a global anyway, a better approach that happens to solve your problem as well would be to have a module that exports two functions:
One that initializes the asynchronous mwn
One that returns mwn when called
// mw.ts
import {mwn} from 'mwn';
import {my} from './my';
let mw: mwn;
export const init = async () => {
  mw = await mwn.init(my.userinfo);
};
export const getMw = () => mw;
Then it can be consumed by other modules quite naturally, barely requiring any typing at all:
// modules/index.ts
// Entry point
import { init } from './mw';
import { test } from './test';
init()
  .then(() => {
    test();
  })
// .catch(handleErrors);

// modules/test.ts
import { getMw } from './mw';

export const test = () => {
  const mw = getMw();
  // anything else that depends on mw
};
The above could be a reasonable approach if mw is used in many places across your app and you don't think that passing it around everywhere would be maintainable.
If you could pass it around everywhere, that would have even less of a code smell, though.
// initMw.ts
import {mwn} from 'mwn';
import {my} from './my';
export const initMw = () => mwn.init(my.userinfo);
// modules/index.ts
// Entry point
import { initMw } from './initMw';
import { test } from './test';
initMw()
  .then((mw) => {
    test(mw);
  })
// .catch(handleErrors);
// modules/test.ts
import { mwn } from 'mwn';
export const test = (mw: mwn) => {
  // anything that depends on mw
};
Initialize it once, then pass it down (synchronously) everywhere it's needed - that's the approach I'd prefer.
You could put the mwn type in a global d.ts file to avoid having to add import { mwn } from 'mwn'; to every module if you wanted. (Yes, it's somewhat of a global variable, but it's a type rather than a value, so it's arguably less of a problem.)
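As a sketch of that suggestion (the alias name MwnBot is made up), a global declaration file could use an import() type so the type is available everywhere without a per-module import:

```typescript
// global.d.ts (sketch): the export {} keeps this file a module so
// that `declare global` augments the global scope
declare global {
  // import() in type position avoids a top-level import statement
  type MwnBot = import("mwn").mwn;
}
export {};
```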
In Jest, how do I avoid erasing static fields of imported and mocked services?
For example, here is an Awesomeness service:
class Awesomeness {
  static get AWESOME_VALUE() {
    return '42';
  }

  doSomethingAwesome() {
    // Implementation
  }
}
I'm importing this service as part of a bigger library and using Awesomeness.AWESOME_VALUE in my code.
Now I need to unit-test my code, so I'm mocking the module:
const mockSomethingAwesome = jest.fn();
jest.mock('big-library', () => ({
  ...jest.requireActual('big-library'),
  Awesomeness: jest.fn().mockImplementation(() => ({
    doSomethingAwesome: mockSomethingAwesome
  }))
}));
Now the whole module is mocked, so when running unit tests Awesomeness.AWESOME_VALUE is undefined - which, obviously, breaks some internal logic.
At this point I don't really care if I get the original value in tests or have an ability to set a mocked/fake value - I just need a way to have it defined.
Any ideas, please?
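One way to keep the static defined is to copy the real class's static property descriptor onto the mock constructor (in a real test this copy would happen inside the jest.mock factory, using jest.requireActual to get the real class). A plain-Node sketch of the idea, with made-up return values:

```javascript
// The real class, as in the library
class Awesomeness {
  static get AWESOME_VALUE() {
    return '42';
  }
  doSomethingAwesome() { /* real implementation */ }
}

// Stand-in for the jest.fn() mock constructor
function MockAwesomeness() {
  return { doSomethingAwesome: () => 'mocked' };
}

// Copy the static getter from the real class onto the mock, so
// Awesomeness.AWESOME_VALUE stays defined after mocking
Object.defineProperty(
  MockAwesomeness,
  'AWESOME_VALUE',
  Object.getOwnPropertyDescriptor(Awesomeness, 'AWESOME_VALUE')
);
```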
I came across a problem where I need to include and evaluate a module in a bundle even though it is not imported anywhere in a Next.js application.
Some background information:
I'm currently working to implement a Java-esque Service Provider Interface pattern to register certain classes in TypeScript.
It is based around a class decorator which pushes the constructor of the wrapped class into a set.
// file: B.ts
@registerMe("classIdB")
class B {}

// file: C.ts
@registerMe("classIdC")
class C {}

// file: import.ts
const instance: C = getInstanceOf("classIdC") as C;
The big problem now is that these decorated classes are not imported anywhere but instantiated dynamically at runtime. So I guess in the Next.js or webpack build pipeline they get excluded or are not present in the first place.
What I tried so far:
(1) Write a webpack plugin which flags classes with a certain path as used, following this implementation.
(I thought tree shaking could be the problem, which it was not.)
class DisableTreeShakingForModule {
  constructor(options) {
    this.options = Object.assign({}, options)
  }

  apply(compiler) {
    compiler.hooks.compilation.tap(PluginName, (compilation) => {
      compilation.hooks.afterOptimizeTree.tap(PluginName, (chunks, modules) => {
        const target = modules.filter((module) => module.resource && module.resource.includes("some-path/"))
        if (!target) return;
        target.forEach((m) => {
          m.used = true
          m.usedExports = true
        })
      })
    })
  }
}
(2) Add a custom entry point to my webpack config in next.config.js.
(I thought I needed to add those files as an extra entry point, but despite a bundle 'test' being created, the files in it are not loaded and evaluated at runtime.)
webpack(prevConfig, { dev, isServer }) {
  return Object.assign({}, prevConfig, { entry: function() {
    return prevConfig.entry().then((entry) => {
      return Object.assign({}, entry, {"test": glob.sync("some-path/*.tsx")})
    })
  }});
}
(3) Idea: add these files to the Next.js common bundle which is always imported (but I don't know how to do that).
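On idea (3): the underlying issue is that the registry only ever sees classes whose files actually get evaluated, so some entry file has to import them for their side effects (e.g. a hand-written barrel that does import "./B"; import "./C"; and is itself imported from a common entry point). A minimal plain-JS sketch of the registry mechanics (hypothetical names; the decorator is applied manually since plain Node lacks decorator syntax):

```javascript
// Registry mapping classId -> constructor
const registry = new Map();

// Class decorator: records the wrapped constructor under its id
function registerMe(classId) {
  return ctor => {
    registry.set(classId, ctor);
    return ctor;
  };
}

// Resolve an instance by its registered id
function getInstanceOf(classId) {
  const Ctor = registry.get(classId);
  if (!Ctor) throw new Error(`no class registered for ${classId}`);
  return new Ctor();
}

// Decorator applied manually; crucially, this registration only runs
// if the file containing class C is actually evaluated by the bundle
class C {}
registerMe("classIdC")(C);
```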
I want to write functions for repeating tasks. So for example I want to have a file lib/dbFunctions to handle anything related to my database, like searching the database for a specific user.
So I just want to create the function searchUser(username) and import my functions with e.g. const dbFunctions = require('lib/dbFunctions'), so I can call dbFunctions.searchUser("ExampleUser");
How would I solve this in NodeJS?
I only found examples using the express.Router() looking like this:
const express = require('express');
const customRoutes = express.Router();

customRoutes.get('/login', function(req, res){
  res.render('account/login');
});

module.exports = {"CustomRoutes": customRoutes};
However, I somehow couldn't figure out how to write a completely custom class with functions.
I usually do this kind of stuff by making a class such as UserRepository.
class UserRepository {
  static searchUser(user) {
    // your logic here
  }

  static getUserByEmail(email) {
    // your logic here
  }
}

module.exports = UserRepository;
You can call this function from anywhere like this:
const UserRepository = require('./user-repository.js');
UserRepository.searchUser(user);
The advantage of the above approach is that you can process other logic in the constructor which would be required before talking to the database. This is basically an OOP-based approach.
The other way is (not OOP based):
const searchuser = (user) => {};
const getUserByEmail = (email) => {};
module.exports = {
searchuser,
getUserByEmail,
};
You can use it in the same way:
const UserRepository = require('./user-repository.js');
UserRepository.searchuser(user);
Sounds like you need to know how to make a module.
Say you have an index.js like this:
const dbFunctions = require('./lib/dbFunctions')
console.log(dbFunctions.searchUser('Dan'))
You can make your module work by creating a file at <project_root>/lib/dbFunctions.js which can look like this:
const searchUser = (name) => `found user ${name}`
module.exports = {searchUser}
Now if you run node index.js you'll see found user Dan print in your console.
It's important to note that when you're importing modules, Node expects a path like lib/dbFunctions to exist inside node_modules, whereas when you're importing a local module within your project you need to specify the relative path from where you're importing - hence I used ./lib/dbFunctions in my example.
Another common way to create a custom lib in your project is using a pattern called barrel files or barrel exports.
Instead of having a file called dbFunctions.js, you'd create a directory at /lib/dbFunctions with an index.js inside the root of that directory. You can then separate out all of your functions into their own modules, like this for example:
// /lib/dbFunctions/searchUser.js
const searchUser = (name) => `found user ${name}`
module.exports = searchUser
// /lib/dbFunctions/searchPost.js
const searchPost = (id) => `found post ${id}`
module.exports = searchPost
And then your index.js file would look like this:
// /lib/dbFunctions/index.js
const searchUser = require('./searchUser');
const searchPost = require('./searchPost');
module.exports = {
  searchUser,
  searchPost
}
Then your main entry point (index.js)
const dbFunctions = require('./lib/dbFunctions')
console.log(dbFunctions.searchUser('Dan'))
console.log(dbFunctions.searchPost(2))
It seems you want to create a module consisting of a bunch of functions. If that's the case, a typical module of functions looks like this:
// file customFunctions.js
function custom1 () { ... }
function custom2 () { ... }
...
module.exports = { custom1, custom2, ... }
The module exports an object containing functions as properties.
Then, just require that file where you need it:
const customFns = require('./customFunctions');
customFns.custom1();
This is a trivial example that illustrates the crux of my problem:
var innerLib = require('./path/to/innerLib');
function underTest() {
  return innerLib.doComplexStuff();
}
module.exports = underTest;
I am trying to write a unit test for this code. How can I mock out the requirement for the innerLib without mocking out the require function entirely?
So this is me trying to mock out the global require and finding out that it won’t work even to do that:
var path = require('path'),
    vm = require('vm'),
    fs = require('fs'),
    indexPath = path.join(__dirname, './underTest');

var globalRequire = require;
require = function(name) {
  console.log('require: ' + name);
  switch(name) {
    case 'connect':
    case indexPath:
      return globalRequire(name);
  }
};
The problem is that the require function inside the underTest.js file has actually not been mocked out. It still points to the global require function. So it seems that I can only mock out the require function within the same file I’m doing the mocking in. If I use the global require to include anything, even after I’ve overridden the local copy, the files being required will still have the global require reference.
You can now!
I published proxyquire which will take care of overriding the global require inside your module while you are testing it.
This means you need no changes to your code in order to inject mocks for required modules.
Proxyquire has a very simple api which allows resolving the module you are trying to test and pass along mocks/stubs for its required modules in one simple step.
@Raynos is right that traditionally you had to resort to not-so-ideal solutions in order to achieve that, or do bottom-up development instead.
This is the main reason why I created proxyquire - to allow top-down test-driven development without any hassle.
Have a look at the documentation and the examples in order to gauge if it will fit your needs.
A better option in this case is to mock methods of the module that gets returned.
For better or worse, most node.js modules are singletons; two pieces of code that require() the same module get the same reference to that module.
You can leverage this and use something like sinon to mock out items that are required. mocha test follows:
// in your testfile
var innerLib = require('./path/to/innerLib');
var underTest = require('./path/to/underTest');
var sinon = require('sinon');
describe("underTest", function() {
  it("does something", function() {
    sinon.stub(innerLib, 'toCrazyCrap').callsFake(function() {
      // whatever you would like innerLib.toCrazyCrap to do under test
    });

    underTest();

    sinon.assert.calledOnce(innerLib.toCrazyCrap); // sinon assertion
    innerLib.toCrazyCrap.restore(); // restore original functionality
  });
});
Sinon has good integration with chai for making assertions, and I wrote a module to integrate sinon with mocha to allow for easier spy/stub cleanup (to avoid test pollution).
Note that underTest cannot be mocked in the same way, as underTest returns only a function.
Another option is to use Jest mocks; follow up on their documentation page.
I use mock-require. Make sure you define your mocks before you require the module to be tested.
Simple code to mock modules for the curious
Notice the parts where you manipulate require.cache, and note the require.resolve method, as this is the secret sauce.
class MockModules {
  constructor() {
    this._resolvedPaths = {}
  }

  add({ path, mock }) {
    const resolvedPath = require.resolve(path)
    this._resolvedPaths[resolvedPath] = true
    require.cache[resolvedPath] = {
      id: resolvedPath,
      filename: resolvedPath,
      loaded: true,
      exports: mock
    }
  }

  clear(path) {
    const resolvedPath = require.resolve(path)
    delete this._resolvedPaths[resolvedPath]
    delete require.cache[resolvedPath]
  }

  clearAll() {
    Object.keys(this._resolvedPaths).forEach(resolvedPath =>
      delete require.cache[resolvedPath]
    )
    this._resolvedPaths = {}
  }
}
Use like:
describe('#someModuleUsingTheThing', () => {
  const mockModules = new MockModules()

  beforeAll(() => {
    mockModules.add({
      // use the same require path as you normally would
      path: '../theThing',
      // mock return an object with "theThingMethod"
      mock: {
        theThingMethod: () => true
      }
    })
  })

  afterAll(() => {
    mockModules.clearAll()
  })

  it('should do the thing', async () => {
    const someModuleUsingTheThing = require('./someModuleUsingTheThing')
    expect(someModuleUsingTheThing.theThingMethod()).to.equal(true)
  })
})
BUT... jest has this functionality built in and I recommend that testing framework over rolling your own for testing purposes.
Mocking require feels like a nasty hack to me. I would personally try to avoid it and refactor the code to make it more testable.
There are various approaches to handle dependencies.
1) Pass dependencies as arguments
function underTest(innerLib) {
  return innerLib.doComplexStuff();
}
This will make the code universally testable. The downside is that you need to pass dependencies around, which can make the code look more complicated.
2) Implement the module as a class, then use class methods/properties to obtain dependencies
(This is a contrived example, where class usage is not reasonable, but it conveys the idea)
(ES6 example)
const innerLib = require('./path/to/innerLib')
class underTestClass {
  getInnerLib () {
    return innerLib
  }

  underTestMethod () {
    return this.getInnerLib().doComplexStuff()
  }
}
Now you can easily stub getInnerLib method to test your code.
The code becomes more verbose, but also easier to test.
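To make that stubbing concrete without pulling in a mocking library, a minimal sketch (names follow the example above; the return values are made up; with sinon this would be sinon.stub(instance, 'getInnerLib')):

```javascript
class UnderTestClass {
  getInnerLib() {
    // in the real module this returns the required innerLib
    return { doComplexStuff: () => "real result" };
  }
  underTestMethod() {
    return this.getInnerLib().doComplexStuff();
  }
}

const instance = new UnderTestClass();
// Override the accessor method on this one instance for the test,
// so underTestMethod picks up the fake dependency
instance.getInnerLib = () => ({ doComplexStuff: () => "stubbed result" });
```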
If you've ever used jest, then you're probably familiar with jest's mock feature.
Using "jest.mock(...)" you can simply specify the string that would occur in a require statement somewhere in your code, and whenever a module is required using that string, a mock object will be returned instead.
For example
jest.mock("firebase-admin", () => {
  const a = require("mocked-version-of-firebase-admin");
  a.someAdditionalMockedMethod = () => {}
  return a;
})
would completely replace all imports/requires of "firebase-admin" with the object you returned from that "factory"-function.
Well, you can do that when using jest because jest creates a runtime around every module it runs and injects a "hooked" version of require into the module, but you wouldn't be able to do this without jest.
I have tried to achieve this with mock-require, but for me it didn't work for nested levels in my source. Have a look at the following issue on GitHub: mock-require not always called with Mocha.
To address this I have created two npm-modules you can use to achieve what you want.
You need one babel-plugin and a module mocker.
babel-plugin-mock-require
jestlike-mock
In your .babelrc use the babel-plugin-mock-require plugin with following options:
...
"plugins": [
  ["babel-plugin-mock-require", { "moduleMocker": "jestlike-mock" }],
  ...
]
...
and in your test file use the jestlike-mock module like so:
import {jestMocker} from "jestlike-mock";
...
jestMocker.mock("firebase-admin", () => {
  const firebase = new (require("firebase-mock").MockFirebaseSdk)();
  ...
  return firebase;
});
...
The jestlike-mock module is still very rudimentary and does not have much documentation, but there's not much code either. I appreciate any PRs for a more complete feature set. The goal would be to recreate the whole "jest.mock" feature.
To see how jest implements that, one can look at the code in the "jest-runtime" package. See https://github.com/facebook/jest/blob/master/packages/jest-runtime/src/index.js#L734 for example; here they generate an "automock" of a module.
Hope that helps ;)
You can't. You have to build up your unit test suite so that the lowest modules are tested first and the higher-level modules that require them are tested afterwards.
You also have to assume that any 3rd party code and node.js itself is well tested.
I presume you'll see mocking frameworks arrive in the near future that overwrite global.require.
If you really must inject a mock you can change your code to expose modular scope.
// underTest.js
var innerLib = require('./path/to/innerLib');
function underTest() {
  return innerLib.toCrazyCrap();
}
module.exports = underTest;
module.exports.__module = module;
// test.js
function test() {
  var underTest = require("underTest");
  underTest.__module.innerLib = {
    toCrazyCrap: function() { return true; }
  };

  assert.ok(underTest());
}
Be warned: this exposes .__module in your API, and any code can access modular scope at its own risk.
You can use mockery library:
describe 'UnderTest', ->
  before ->
    mockery.enable( warnOnUnregistered: false )
    mockery.registerMock('./path/to/innerLib', { doComplexStuff: -> 'Complex result' })
    @underTest = require('./path/to/underTest')

  it 'should compute complex value', ->
    expect(@underTest()).to.eq 'Complex result'
I use a simple factory that returns a function that calls a function with all of its dependencies:
/**
 * fnFactory
 * Returns a function that calls a function with all of its dependencies.
 */
"use strict";

const fnFactory = ({ target, dependencies }) => () => target(...dependencies);

module.exports = fnFactory;
Wanting to test the following function:
/*
 * underTest
 */
"use strict";

const underTest = ( innerLib, millions ) => innerLib.doComplexStuff(millions);

module.exports = underTest;
I would setup my test (I use Jest) as follows:
"use strict";
const fnFactory = require("./fnFactory");
const _underTest = require("./underTest");
test("fnFactory can mock a function by returning a function that calls a function with all its dependencies", () => {
  const fake = millions => `Didn't do anything with ${millions} million dollars!`;
  const underTest = fnFactory({ target: _underTest, dependencies: [{ doComplexStuff: fake }, 10] });

  expect(underTest()).toBe("Didn't do anything with 10 million dollars!");
});
In production code I would manually inject the callee's dependencies as below:
/**
 * main
 * Entry point for the real application.
 */
"use strict";

const underTest = require("./underTest");
const innerLib = require("./innerLib");

underTest(innerLib, 10);
I tend to limit the scope of most of the modules that I write to one thing, which reduces the number of dependencies that have to be accounted for when testing and integrating them into the project.
And that's my approach to dealing with dependencies.