I have a TinyMCE button, and it works apart from passing the data from the media gallery into a variable (to insert into the content later).
I have the following:
window.on('select', function(){
    var files = window.state().get('selection').toArray();
    console.log(files.id);
});
which doesn't work, but if I change it to:
window.on('select', function(){
    var files = window.state().get('selection').toArray();
    console.log(files);
});
I get "array (object)" in the console.log and by opening the object I can see id is one of the "fields" available and has a value.
The basic idea is that the button (before this code) opens the WordPress media library and, on selection of the images, passes the IDs of the images to TinyMCE so it can print them later. The only part that's not working is the above.
Can anyone point me in the right direction? (Vanilla JS is not my forte and this is my first time using TinyMCE.)
You simply have to iterate over the files, because there are multiple files in this array.
Try this code.
window.on('select', function(){
    // toArray() returns a plain array of the selected attachments
    var files = window.state().get('selection').toArray();
    files.forEach(function(file){
        console.log(file.id); // each selected attachment's ID
    });
});
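If the goal is then to drop those IDs into the editor content, here is a minimal sketch (assuming the standard wp.media frame from the question and the active TinyMCE editor; the gallery shortcode is just one example of what you might insert):
window.on('select', function(){
    var files = window.state().get('selection').toArray();
    // Collect the IDs and insert e.g. a gallery shortcode into the editor
    var ids = files.map(function(file){ return file.id; });
    tinymce.activeEditor.insertContent('[gallery ids="' + ids.join(',') + '"]');
});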
I am building a website with several HTML pages, and I'm going to fill in info on the different pages through an API. I have added onclick listeners to HTML elements like this:
// ASSIGNING ELEMENTS AS VARIABLES
const EPL = document.getElementById('epl');
const bundesliga = document.getElementById('bundesliga');
const laliga = document.getElementById('laliga');
// ONCLICKS
EPL.onclick = function() {
    getStandings('2021');
    location.replace('standings.html');
}
bundesliga.onclick = function() {
    getStandings('2088');
    location.replace('standings.html');
}
laliga.onclick = function() {
    getStandings('2224');
    location.replace('standings.html');
}
When one of these is clicked, I call a function (getStandings) with its unique argument to fetch some data from the API. I also want to move to another HTML page, for which I used location.replace.
I'm caught in a dilemma: if I use the same JS file for every HTML page, when I get to the new HTML page, I get errors as the new HTML page does not have every element:
main.js:41 Uncaught TypeError: Cannot set property 'onclick' of null
But if I use different JS files, maybe one JS file for each HTML file, I cannot carry forward the bits of information I need. How can I get to the new HTML page, with its own JS file, without stopping and losing everything in the function I'm currently in, under the JS file of the old page? For example, arguments such as '2021' or '2088' are to be passed into the getStandings() function, which will populate the new HTML page with data from an API. If I jump to a new HTML page with a new JS file, this is lost.
Is there a better way to organise my files? 😐😐😐😐😐
You can set your event listeners on the condition that the elements are not null, e.g.
const EPL = document.getElementById('epl');
const bundesliga = document.getElementById('bundesliga');
const laliga = document.getElementById('laliga');
if (EPL) {
    EPL.onclick = function() {
        getStandings('2021');
        location.replace('standings.html');
    }
}
etc...
Solved! As amn said, I can add URL parameters to the end of the URL of the new HTML page, then get the variables from its own URL once I'm on the new HTML page.
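Roughly, this is what that looks like (the 'league' parameter name is just an example, and getStandings has to be available in the script loaded by standings.html):
// On the league page: pass the league code along in the URL instead of fetching first
EPL.onclick = function() {
    location.replace('standings.html?league=2021');
}

// In the script loaded by standings.html: read the parameter and fetch there
var params = new URLSearchParams(window.location.search);
var leagueId = params.get('league'); // e.g. "2021"
getStandings(leagueId);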
I think I would rather use classes instead of IDs to attach the listeners, and keep IDs for dedicated actions.
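A possible sketch of that idea, using a shared class plus a data attribute for each league code (the class and attribute names here are made up for illustration):
document.querySelectorAll('.league-link').forEach(function(link) {
    link.onclick = function() {
        // Each element carries its own code, e.g. <div class="league-link" data-league="2021">
        location.replace('standings.html?league=' + this.dataset.league);
    };
});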
I am developing a web page in an internal tool that uses IE11. The tool has a different infrastructure where not all native JavaScript code works. We have included the jQuery library.
The issue is with reading the file from the file input element. After I browse and select a file, the code is able to read the inputFile element with no issues:
var selectedFile = $('#inputFile');
However, it does not find any files under this element:
if (selectedFile.files.length > 0)
I tried other alternatives which do not work either:
var input = document.getElementById('inputFile');
var file = input.files[0];
OR
var x = document.getElementById("inputFile");
if ('files' in x) {}
selectedFile is a jQuery object, whereas the files collection only exists on the Element object within that.
Either of these approaches will retrieve the files collection from the Element within the jQuery object:
// #1
var $selectedFile = $('#inputFile');
if ($selectedFile[0].files.length > 0) {
    // do something...
}

// #2
var $selectedFile = $('#inputFile');
if ($selectedFile.prop('files').length > 0) {
    // do something...
}
Note that I prefixed the selectedFile variable with a $. This is standard practice when working with jQuery as it makes it clear that the variable holds a jQuery object, not an Element.
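If the next step is to actually read the selected file's contents, a small sketch building on the above (FileReader is supported in IE11; readAsText is just one of its read methods) might be:
var files = $('#inputFile').prop('files');
if (files && files.length > 0) {
    var reader = new FileReader();
    reader.onload = function(e) {
        console.log(e.target.result); // file contents as text
    };
    reader.readAsText(files[0]);
}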
For many reasons it isn't always ideal to install a plugin on multiple sites just to copy content. For pages/posts it's really easy to grab the post content and copy+paste (even when using page builders, the content is available in an underlying textarea). Menus, on the other hand, are a pain. What is your go-to method for copying menus?
This is what I whipped up today to solve copying WordPress menus via JavaScript / without a plugin.
First, navigate to the menu you want to copy, then paste this into the console.
var items = jQuery('#menu-to-edit li');
var json = [];
jQuery.each(items, function(i, item) {
    item = jQuery(item);
    var title = item.find('.edit-menu-item-title').val();
    var url = item.find('.edit-menu-item-url').val();
    var classes = item.find('.edit-menu-item-classes').val();
    var description = item.find('.edit-menu-item-description').val();
    var menuitem = {"title" : title, "url" : url, "classes" : classes, "description" : description};
    json.push(menuitem);
});
JSON.stringify(json);
Copy that output and navigate to the new menu, then paste the content into the parse statement below:
var json = JSON.parse('PASTED DATA HERE');

function addItem(item)
{
    // Custom links need some URL, so fall back to "#" before filling in the form
    if (item.url == "") {
        item.url = "#";
    }
    jQuery('#custom-menu-item-url').val(item.url);
    jQuery('#custom-menu-item-name').val(item.title);
    jQuery('#submit-customlinkdiv').click();
}
Then simply shift elements out of that json array and pass them to the addItem function.
addItem(json.shift());
Rinse and repeat until your items are added.
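If you would rather not repeat that step by hand, here is a rough sketch that feeds the items in automatically (assuming the admin's add-item request completes within the one-second delay, which is only a guess):
var timer = setInterval(function() {
    if (json.length === 0) {
        clearInterval(timer); // all items have been submitted
        return;
    }
    addItem(json.shift());
}, 1000);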
There is lots of room for optimization on this, e.g. it could add the classes and descriptions, re-order the menu items to the correct depth, add extended properties, or monitor the add-menu-item form and automatically add the next item for you. It could become a handy bookmarklet to beat back throwaway plugins like those that do simple menu copies.
I'm trying to use PDF.js' viewer to display pdf files on a page.
I've gotten everything working, but I would like to be able to 'jump to' a specific page in the PDF. I know you can set the page with the URL, but I would like to do this in JavaScript if it's possible.
I have noticed that there is a PDFJS object in the global scope, and it seems that I should be able to get access to things like the page setting there, but it's a rather massive object. Does anyone know how to do this?
You can set the page via JavaScript with:
var desiredPage = [the page you want];
PDFViewerApplication.page = desiredPage;
There is an event handler on this, and the UI will be adjusted accordingly. You may want to ensure this is not out of bounds:
function goToPage(desiredPage){
    var numPages = PDFViewerApplication.pagesCount;
    if ((desiredPage > numPages) || (desiredPage < 1)) {
        return;
    }
    PDFViewerApplication.page = desiredPage;
}
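For example, wired up to a hypothetical button and input on the page hosting the viewer:
document.getElementById('jumpButton').addEventListener('click', function() {
    var target = parseInt(document.getElementById('pageInput').value, 10);
    goToPage(target);
});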
In my case I was loading the PDF file inside an iframe, so I had to do it a slightly different way.
function goToPage(desiredPage){
    // Either of these gets the iframe's window object (by frame name or by element ID)
    // var frameObject = window.frames["iframe-name"];
    var frameObject = document.getElementById("iframe-id").contentWindow;
    frameObject.PDFViewerApplication.page = desiredPage;
}
If the PDF is shown in an iframe and you want to navigate to a page, use the code below. 'docIframe' is the iframe tag's ID.
document.getElementById("docIframe").contentWindow.PDFViewerApplication.page = 2;
I had help with this background carousel made with jQuery for a website, and it works just great, except that the page can take a while to load initially. I thought that if I actually downloaded the pictures I'm using for the background, instead of loading them via 'http://www.whatever.jpg', the page might load faster. But I'm still kind of a noob and haven't been able to figure out why this isn't working. Here is my code:
var images = [
    // even though I downloaded the picture and it's in the same folder as this file.js,
    // the background just loads a black page, then the other 2 load fine
    "bg1-orig.jpg",
    "http://www.desktopaper.com/wp-content/uploads/Cool-Hd-Wallpapers-2.jpg",
    "http://wallpaperscraft.com/image/restaurant_table_interior_modern_style_39288_1920x1080.jpg"
];

var $body = $("body"),
    $bg = $("#bg"),
    n = images.length,
    c = 0; // Loop Counter
num = 200;

// Preload Array of images...
for (var i = 0; i < n; i++) {
    var tImg = new Image();
    tImg.src = images[i];
}

$body.css({backgroundImage : "url("+images[c]+")"});

(function loopBg(){
    $bg.hide().css({backgroundImage : "url("+images[++c%n]+")"}).delay(7000).fadeTo(2200, 1, function(){
        $body.css({backgroundImage : "url("+images[c%n]+")"});
        loopBg();
    });
}());
I've searched around for a while now... thanks for the help!
You do yourself no favors trying to pre-load the images right before they are loaded for display in your CSS. In either case, the images have to be loaded first before you can see them, so there is going to be a delay regardless.
If the only thing that you want to change is the src of the images so they come from a local folder, then I assume the only change to otherwise working code is that the strings in your images array now point to local files.
If that's the case, and if you want the paths to resolve correctly without adding some sort of directory prefix (../, img/ or the like), then you will need those images to be in the same directory as the HTML file, not the file.js file.
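If the aim is to avoid the initial black background, one option is to start the loop only after every image has finished loading. Here is a rough sketch along those lines, reusing the images array from the question and assuming loopBg is defined as a named function rather than started immediately as an IIFE:
var loaded = 0;
images.forEach(function(src) {
    var img = new Image();
    img.onload = img.onerror = function() {
        loaded++;
        if (loaded === images.length) {
            // every image is in the browser cache now, so the first paint won't be blank
            $body.css({backgroundImage : "url(" + images[0] + ")"});
            loopBg();
        }
    };
    img.src = src;
});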