I have been wondering how to write the down function in a migration file. Ideally, it should be the exact opposite of what the up method does. Now suppose I wrote an up function to drop a unique constraint on a column, then added some new rows (containing duplicate data) to the table, and now I want to roll back the migration. Ideally, I would write the down method to add the unique constraint back on the column, but the migration would not roll back because the table now contains duplicate data.
So my questions are -
What should I do in such a situation?
How should I write the down function in migrations?
Can I leave the down function blank in such situations?
Thanks.
I usually don't write down functions at all and just leave them empty.
I never roll back migrations; if I want to get back to an earlier DB state, I just restore the whole DB from backups.
If I just want to put the unique constraint back, I write another up migration that fixes the duplicate rows and then adds the unique constraint back.
I know that many people use rollbacks between tests to reset the DB, but that is a really slow way to do it.
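That forward-fix approach might look roughly like this, TypeORM-style. The table, column, and constraint names here are hypothetical, and a minimal QueryRunner interface is declared inline so the sketch stands alone (in a real project you would import MigrationInterface and QueryRunner from "typeorm"):

```typescript
// Minimal stand-in for TypeORM's QueryRunner so the sketch is self-contained.
interface QueryRunner {
  query(sql: string): Promise<any>;
}

// Hypothetical forward-fix migration: dedupe first, then restore the constraint.
class DedupeAndRestoreUnique1700000000000 {
  async up(queryRunner: QueryRunner): Promise<void> {
    // Delete duplicate rows, keeping the lowest id per email (MySQL syntax).
    await queryRunner.query(
      `DELETE t1 FROM users t1
       JOIN users t2 ON t1.email = t2.email AND t1.id > t2.id`
    );
    // With the duplicates gone, the unique constraint can come back safely.
    await queryRunner.query(
      `ALTER TABLE users ADD CONSTRAINT uq_users_email UNIQUE (email)`
    );
  }

  async down(queryRunner: QueryRunner): Promise<void> {
    // Dropping the constraint is reversible; the deleted duplicates are not,
    // which is exactly why rolling back the original migration would fail.
    await queryRunner.query(`ALTER TABLE users DROP INDEX uq_users_email`);
  }
}
```

The point of fixing forward is that any data loss happens only in an up migration you wrote and reviewed, never as a side effect of an automatic rollback.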
Related
Is it possible to revert a specific migration in TypeORM? I want to revert only a particular migration, not all of them down to the one I want to revert.
Normally, if you want to revert multiple migrations, you just call typeorm migration:revert multiple times; it starts reverting from the last executed migration and removes it from the database.
If you are really sure about reverting a specific migration before some others, you might try tweaking its id value in the migrations table.
If you have a table update you want to change that is not related to the last migration committed then you should write a new migration to make the change.
Reverting any migration is a last resort operation that is available to you when things don't go as planned, but I find that most problems can be solved forward with new migrations rather than reverting back.
Also, if you find your migrations are too large, rebase them. You can remove all migrations and generate a single base migration that creates the database as it currently stands. We find this useful to do after a long period, as migrations become redundant over time.
I know ahead of time that my question may not be a "best practice", but I promise I won't understand the "best practice" until I can implement what I'm asking and then understand why it is inefficient.
I work with React tables, and I'm trying to create helper functions based on Cypress.Promise. I did so in the past at a previous job, but I never internalized the logic behind Promises and consequently forgot how to get values, pull strings, and then return the list to a variable for the rest of the tests. And while it sounds simple to describe, I'm having a difficult, frustrating time remembering how to implement it.
From there the plan would be to sort the list (asc & desc) via the UI and confirm with Cypress.
Search for a random name in the list and validate that it's the only value in the list.
I forgot...
But I just need a function along the lines of let table_strings = CyPromiseFunction("table_element")
That returns a list like ["Slack","Meet","Hangouts","Messenger","Teams"]
That I can filter & search off of.
It's not super clear what you're trying to achieve, but it sounds like you want to grab table data and use it to validate filtering, sorting, etc. If this is not the case, please update your question to be more specific.
If this is what you are trying to do, then I am not sure why you need to use Cypress.Promise to achieve this.
If your React table uses a tbody/thead/tr/td type structure, you could probably use the Cypress-Get-Table plugin to grab your table data. If nothing else, reviewing its source code could help put you on the right track.
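If you would rather roll your own helper, most of the Promise machinery disappears if you split out a pure function and let cy.get(...).then(...) feed it. A sketch, where the selector and cell structure are assumptions about your table:

```typescript
// Pure helper: pull trimmed text out of cell-like objects. In a Cypress
// spec you would pass it the elements yielded by cy.get(...).
function cellTexts(cells: Array<{ innerText: string }>): string[] {
  return cells.map((cell) => cell.innerText.trim());
}

// Sketch of the Cypress side (not runnable outside a spec):
//
// cy.get("table tbody tr td:first-child").then(($cells) => {
//   const names = cellTexts(Array.from($cells as any));
//   // e.g. ["Slack", "Meet", "Hangouts", "Messenger", "Teams"]
//   // ...sort/filter assertions against `names` go here...
// });
```

Keeping the extraction pure means you can unit-test it without a browser, and the Cypress part stays a one-line .then().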
I'm currently trying to modify the selection order of some records using a javascript drag&drop mechanism.
This is the idea:
Once I've ordered the elements by d&d I retrieve the IDs of each element (in the right order) and I send them to php via ajax call.
I store the array of IDs somewhere (to develop)
Then, I run a query like this:
$sql = "SELECT * FROM items ORDER BY field(id, ".$order.");";
(where $order is the imploded array of IDs)
It works quite well, but since I have never used this feature before, my doubt is:
since my IDs are 16-character strings, and supposing I have 200 records to order...
...should I expect trouble in terms of performance?
Do you see any better solution?
Thanks.
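For what it's worth, the bigger risk in that query is not the 200-row FIELD() sort (that is cheap) but the raw concatenation of $order, which is open to SQL injection. A hedged sketch of building the same clause with one placeholder per ID, shown in TypeScript to keep the examples in one language; the same idea applies with PDO prepared statements in PHP:

```typescript
// Build "ORDER BY FIELD(id, ?, ?, ...)" so the IDs travel as bound
// parameters instead of being spliced into the SQL text.
function buildOrderQuery(ids: string[]): { sql: string; params: string[] } {
  const placeholders = ids.map(() => "?").join(", ");
  return {
    sql: `SELECT * FROM items ORDER BY FIELD(id, ${placeholders})`,
    params: ids,
  };
}
```

The driver then substitutes each ? with a properly escaped ID at execution time.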
The comments up there made me think and I realized that this approach has a big issue.
Even if I send the $order array only at the end of the drag&drop process - I mean, push a button (unlock d&d), reorder, confirm (send & lock) - it would still be necessary to perform a custom select on every single JS action that causes a refresh of the elements (view, create, rename, ...). And that's pretty dumb.
So I guess the best approach is the one suggested by Kiko, maybe with a lock system as described above to avoid an AJAX call and the consequent reindexing of the order field on every single move.
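That order-field approach could be sketched as follows (the table and column names are hypothetical): once the user confirms the new order, each ID gets its index written to a position column, and every later read becomes a plain ORDER BY.

```typescript
// One UPDATE per row: position = the ID's index in the confirmed order.
function buildReorderUpdates(
  orderedIds: string[]
): Array<{ sql: string; params: [number, string] }> {
  return orderedIds.map((id, index) => ({
    sql: "UPDATE items SET position = ? WHERE id = ?",
    params: [index, id],
  }));
}

// Afterwards, reads no longer need FIELD() at all:
//   SELECT * FROM items ORDER BY position;
```

Reindexing happens once per confirmed reorder instead of on every refresh, which is the whole point of the lock/confirm step.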
I'm trying to count how many child nodes a particular node contains in my database.
I am planning on having many users and want the experience to be as fast as possible, so I know I don't want to download the parent node and count.
I've thought of simply storing a counter field and, every time a user adds something to that parent, also incrementing that value. However, I am pretty inexperienced with this and am worried
that two users adding something at the same time will cause that value to be incorrect, which, from my reading, is what a transaction operation is created for.
I remember when I used Parse a while ago, there was something called Cloud Code that would constantly run on the server; in particular, I would use it for maintenance operations on the database.
Would running a transaction operation be the solution here? Curious to hear how others handle stuff like this. Do they have some sort of monitoring server maintaining it?
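A transaction is the standard fix for exactly that read-modify-write race. In the Firebase Realtime Database, the update function you hand to a transaction must be pure and tolerant of being retried, because the SDK re-runs it if another client wrote the value in the meantime. A sketch, where the ref path is hypothetical:

```typescript
// Transaction update function: it may run more than once under contention,
// so it only derives the new value from the current one, with no side effects.
function incrementCounter(current: number | null): number {
  // The node may not exist yet, in which case Firebase passes null.
  return (current ?? 0) + 1;
}

// App-side usage (sketch, assumes the modular Firebase SDK):
//
// import { getDatabase, ref, runTransaction } from "firebase/database";
// runTransaction(ref(getDatabase(), "parents/p1/childCount"), incrementCounter);
```

With this in place, two simultaneous adds both land: one transaction commits, the other retries against the fresh value, so the counter ends up correct without any monitoring server.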
I wrote some code in javascript that manipulates field info for my contact form. The code is triggered at OnLoad and OnSave. It works well but the company has 5000+ records that need this code to be applied to it.
Is there a way to write code, a plug-in, or a workflow that would simply load each record and then close it, which I could run on all records to apply the JavaScript, or do I have to open each record individually?
You could do one of the following, depending on what you find easiest:
Console application (personal preference, and the easiest) - get the record unique numbers, if you have something set up, or the GUIDs of the records you want updated.
Plugin (some sort of record update is required to trigger your logic) - add a throwaway field, register the plugin on update of that field, then use bulk update to update the field on all the records you want the plugin to run against. Delete the throwaway field afterwards.
Workflow (a custom workflow activity or out of the box, depending on the complexity of the manipulation; no record update is required to trigger your logic) - create a workflow or custom workflow activity and run it against all the records. It will probably take an eternity to get all 5000 updated at a maximum of 250 at a time.
It sounds like one of the easiest ways to do what you want is to simply export the desired records and fields, and then manipulate them in Excel. You would apply the same logic in Excel as you have in your form JavaScript.
Once you're done, just import them back into CRM and the existing records should be updated.
The most important thing is to export only after selecting the option "Make this data available for re-importing by including required column headings".
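Whichever route you pick, the bulk step boils down to applying the same transformation your form JavaScript performs to every exported record before re-importing. A hedged sketch, where the field name and the normalization rule are hypothetical stand-ins for your actual logic:

```typescript
// A record as it comes out of the export: a flat map of string fields.
interface ContactRecord {
  phone: string;
  [field: string]: string;
}

// The same manipulation the OnLoad/OnSave script would do, as a pure
// function you can run over every exported row.
function normalizeContact(record: ContactRecord): ContactRecord {
  // Example rule: strip everything but digits from the phone field.
  return { ...record, phone: record.phone.replace(/\D/g, "") };
}
```

Factoring the manipulation out like this also lets your form script and your bulk tool share one tested implementation instead of two diverging copies.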