Monday, 12 June 2017

node mssql PreparedStatement output identity id

I'm using https://www.npmjs.com/package/mssql

I'm using sql.PreparedStatement to do the insert, and I would like to use request.output('id', sql.Int) to get my identity ID (auto-incrementing ID), but I'm getting a null value.

query:

INSERT INTO queue (refid, name, channel) VALUES (@refid, @name, @channel)

If I use request.output('refid', sql.Int), it is able to get the refid.

If I use OUTPUT INSERTED.id, the id is stored in recordset as below:

<
  recordsets[[object]],
  recordset[<id:xx>],
  output<refid:123>,
  rowsAffected: [1],
  returnValue: 0
>

What I prefer is:

<
  recordsets[[object]],
  recordset[],
  output<id:999, refid:123>,
  rowsAffected: [1],
  returnValue: 0
>
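For reference, an untested sketch of one possible direction, based on the result object shown above: since @id is declared only as an output parameter, something in the statement has to assign it, e.g. SCOPE_IDENTITY(). All names and values here are illustrative, not the asker's actual code:

// sketch only - pool/connection setup omitted
const sql = require('mssql');

const ps = new sql.PreparedStatement(/* connection pool */);
ps.input('refid', sql.Int);
ps.input('name', sql.NVarChar);
ps.input('channel', sql.NVarChar);
ps.output('id', sql.Int); // identity value to be returned

const query = 'INSERT INTO queue (refid, name, channel) VALUES (@refid, @name, @channel); ' +
              'SET @id = SCOPE_IDENTITY();';

ps.prepare(query, err => {
  if (err) return console.error(err);
  ps.execute({ refid: 123, name: 'test', channel: 'web' }, (err, result) => {
    if (err) return console.error(err);
    console.log(result.output); // hopefully { id: 999 }
    ps.unprepare();
  });
});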



via hendry91

NodeJS - Cryptography (S.S.E.)

Can someone help me do the inverse operation of this? http://www.paranoiaworks.mobi/sse/file_encryption_specifications.html

I have encrypted a file on Android with an app that uses this specification (the algorithm was AES-256). I want to decrypt the file using Node.js, but the crypto module alone is not sufficient, because it does not provide the functions needed to derive the key the way the app does. Is there a library that I can use to do that?

PS: I need to decrypt it using Node.js and not their desktop app because it's a very sensitive file and I need to work with it only in memory.
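For what it's worth, Node's crypto module does provide the generic building blocks (key derivation and decryption); what's missing is the SSE-specific derivation scheme from the linked spec. A minimal sketch of the primitives only, with placeholder parameters that do not match the SSE format:

const crypto = require('crypto');

// Placeholder parameters - the real salt, iteration count, IV and file layout
// must follow the SSE file-encryption specification linked above.
function decrypt(password, salt, iv, ciphertext) {
  const key = crypto.pbkdf2Sync(password, salt, 10000, 32, 'sha256'); // 256-bit key
  const decipher = crypto.createDecipheriv('aes-256-cbc', key, iv);
  return Buffer.concat([decipher.update(ciphertext), decipher.final()]);
}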

Thanks in advance.



via Murillo Brandão

Implement helmet-csp on individual routes

I'm creating a sample Express app to demonstrate Content-Security-Policy (CSP) headers and am trying to use helmet-csp.

All of the documentation for helmet-csp shows it used as standard third-party middleware via app.use(csp({ ... })). This adds the CSP headers to every route in my application, but I want to customize them on individual routes.

Sample App

var express = require('express');
var http = require('http');
var csp = require('helmet-csp');
var app = express();

app.use(csp({
    directives: {
        frameSrc: ["'none'"]
    }
}));

app.get('/', (request, response) => {
    response.send('hi, :wave: =]');
});

app.get('/frameable', (request, response) => {
    response.send('you can frame me!');
});

http.createServer(app).listen(80, (err) => {
    if (err) {
        return console.log('error', err);
    }
});

With the above, every route receives the CSP header:

Content-Security-Policy: frame-src 'none'

In the /frameable route, I would want to override this CSP header to be:

Content-Security-Policy: frame-src 'self'

Whenever I need/want to customize a header set by helmet-csp on a per-route basis, do I need to manually override them inside each app.get with a line such as:

response.setHeader('Content-Security-Policy', "frame-src 'self'");

Or is there a way to do this via helmet-csp itself?
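One possible approach (not taken from the helmet-csp docs, just relying on the fact that csp() returns ordinary Express middleware) is to mount a differently configured instance on the individual route, which overwrites the header set by the app-wide middleware:

// hedged sketch: a second csp() instance applied only to this route
app.get('/frameable', csp({
    directives: {
        frameSrc: ["'self'"]
    }
}), (request, response) => {
    response.send('you can frame me!');
});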



via newfurniturey

Within Express, how can I add core modules to my package.json dependencies so I can require them in my js files?

I'm trying to use the core module 'connect' and, if I understand the situation correctly, Express uses that module internally. I want to require it manually, and I've tried to insert it into my dependencies within my package.json file. Maybe I'm not installing it correctly from my Git Bash? I'm taking shots in the dark here, but here are the dependencies within the file.

"dependencies": {
"connect": "2.4.2",
"accepts": "~1.3.3",
"array-flatten": "1.1.1",
"content-disposition": "0.5.2",
"content-type": "~1.0.2",
"cookie": "0.3.1",
"cookie-signature": "1.0.6",
"debug": "2.6.7",
"depd": "~1.1.0",
"encodeurl": "~1.0.1",
"escape-html": "~1.0.3",
"etag": "~1.8.0",
"finalhandler": "~1.0.3",
"fresh": "0.5.0",
"merge-descriptors": "1.0.1",
"methods": "~1.1.2",
"on-finished": "~2.3.0",
"parseurl": "~1.3.1",
"path-to-regexp": "0.1.7",
"proxy-addr": "~1.1.4",
"qs": "6.4.0",
"range-parser": "~1.2.0",
"send": "0.15.3",
"serve-static": "1.12.3",
"setprototypeof": "1.0.3",
"statuses": "~1.3.1",
"type-is": "~1.6.15",
"utils-merge": "1.0.0",
"vary": "~1.1.1"

}
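For reference, a dependency only becomes require-able once it has actually been installed into node_modules; editing package.json by hand is not enough on its own. A minimal sketch, run from the project root:

npm install connect --save

After that, var connect = require('connect'); should resolve from any file in the project.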



via Forrest Carlton

gulp-replace from template and project source and merge into single file

Alright, one for the ages. I am trying to merge content from multiple sources into a single file. One of my sources is template.html, which contains the base structure for index.html (the source). I'm trying to find a solution that replaces HTML block data from the source (the index.html body tag) into the destination (template.html) and finally creates a new merged document that is written to the dest folder.

File structure that is as follows:

  • public
  • ---assets
  • ------template.html
  • ---src
  • ------app.js
  • ------index.html
  • ------main.css
  • ---dist
  • ------merged.html

I am using npm package gulp-replace (but open to other packages).

var gulp = require('gulp');
var htmlreplace = require('gulp-html-replace');
var copycat = require('gulp-copycat');

gulp.task('default', function() {
    return gulp.src('assets/template.html')
        .pipe(htmlreplace({
            'cssInline': {
                src: gulp.src('sheetgmail/main.css'),
                tpl: '<style>%s</style>'
            },
            'body': gulp.src('sheetgmail/index.html'),
            'headscript': {
                src: null,
                tpl: '%s'
            },
            'js': {
                src: gulp.src('sheetgmail/app.js'),
                tpl: "<script type='text/javascript'>%s</script>"
            },
        }, {
            keepUnassigned: false,
            keepBlockTags: true,
            resolvePaths: false
        }))
        .pipe(gulp.dest('dist/'));
});



via Isaiah Monroe Davis

node-gyp rebuild fails (error: binding.cc: No such file)

I'm trying to run node-gyp rebuild but get a fatal error.

c1xx : fatal error C1083: Cannot open source file: '..\src\binding.cc': No 
such
gyp ERR! build error
gyp ERR! stack Error: `C:\Program Files (x86)\MSBuild\14.0\bin\msbuild.exe` 
fail
gyp ERR! stack     at ChildProcess.onExit 
(C:\Users\Admin\AppData\Roaming\npm\no
gyp ERR! stack     at emitTwo (events.js:106:13)
gyp ERR! stack     at ChildProcess.emit (events.js:191:7)
gyp ERR! stack     at Process.ChildProcess._handle.onexit 
(internal/child_proces
gyp ERR! System Windows_NT 10.0.15063
gyp ERR! command "C:\\Program Files\\nodejs\\node.exe" 
"C:\\Users\\Admin\\AppDat
gyp ERR! cwd C:\Users\Admin
gyp ERR! node -v v6.11.0
gyp ERR! node-gyp -v v4.0.0
gyp ERR! not ok

I'm using Python 2.7, node-gyp v4.0.0 and Node v6.11.0,

and node-gyp configure also warns about a missing input binding.cc.
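For context, the error suggests the addon's binding.gyp lists a source file (src/binding.cc) that doesn't exist on disk relative to the addon directory. A minimal binding.gyp for comparison (illustrative only):

{
  "targets": [
    {
      "target_name": "binding",
      "sources": [ "src/binding.cc" ]
    }
  ]
}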



via L.seonbeen

WordPress API with Express

Is it possible to make an external HTTP GET request to the WordPress API using Express?

Let's say I want to make a GET request to http://demo.wp-api.org/wp-json/wp/v2/posts - this is a list of posts from WordPress.

Sample:

router.get('/posts', function(req, res){
     // I should make an external http request here to the WordPress API
     // ("http://demo.wp-api.org/wp-json/wp/v2/posts")

     // Then I want to display the response as JSON
});
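A minimal sketch of one way this could look, assuming the request package is installed (any HTTP client would do):

var request = require('request');

router.get('/posts', function(req, res) {
  request('http://demo.wp-api.org/wp-json/wp/v2/posts', function(err, response, body) {
    if (err) return res.status(500).json({ error: err.message });
    res.type('json').send(body); // body is already a JSON string
  });
});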



via Sherwin Ablaña Dapito

A problem using Expo when developing with React Native

When I use Expo to develop a project, the first execution of npm run ios reports errors, but executing it a second time works normally.

What's the solution?

As shown in figure:

Error screenshot

Error is as follows: › npm run ios

react-native-starterkit@0.1.0 ios /Users/****/work/react-native-starterkit

react-native-scripts ios

10:09:56: Starting packager...

10:10:09: Starting simulator...

10:10:30: Failed to start simulator:

Error: Process exited with non-zero code: 60

Exiting...



via 祝梓毅

How to enable print media emulation in headless Chrome?

Is there a way to enable simulated device mode or emulated print media mode in headless Chrome in Linux?

It can be done manually in DevTools like so:

Enable print media emulation

The goal is to take a full-page screenshot in emulated print media mode without injecting or modifying any CSS. I'm already able to take screenshots of web pages via Node.js, but not in emulated print media mode. I've searched, but am also unable to find a helpful CLI switch.

Example: StackOverflow print emulation

How to do this programmatically via CLI or Node.js? Is it even possible?
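One possible route from Node.js (a sketch, assuming the chrome-remote-interface package and a Chrome instance started with --headless --remote-debugging-port=9222): the DevTools protocol exposes Emulation.setEmulatedMedia, which is what the DevTools checkbox toggles. The full-page sizing logic from the existing screenshot code would still apply.

const CDP = require('chrome-remote-interface');

CDP(async (client) => {
  const { Page, Emulation } = client;
  await Page.enable();
  await Page.navigate({ url: 'https://stackoverflow.com' });
  await Page.loadEventFired();
  await Emulation.setEmulatedMedia({ media: 'print' }); // emulate print media, no CSS injection
  const { data } = await Page.captureScreenshot({ format: 'png' });
  require('fs').writeFileSync('print.png', Buffer.from(data, 'base64'));
  await client.close();
}).on('error', (err) => console.error(err));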



via Drakes

Node.js - How to return after all asynchronous calls have finished

As shown below, I am pushing the object link_to_json returns into an array allShirts declared in html_to_json.

However, the console.dir on the third-to-last line and the return value of html_to_json log an array of undefined values, which I presume is because console.dir and return execute before the link_to_json calls have finished.

How do I ensure the return value of html_to_json is a filled up allShirts array?

//Go to individual links and scrape relevant info
const link_to_json = (link) => {
    request(link, (err, res, body) => {
        if (!error_handler(err, res, link)) {
            const $ = cheerio.load(body);
            const shirt_detail = $('.shirt-details').find('h1').text();

            const Title = shirt_detail.substr(shirt_detail.indexOf(' ') + 1);
            const Price = shirt_detail.substr(0, shirt_detail.indexOf(' '));
            const ImageURL = $('.shirt-picture').find('img').attr('src');
            const URL = link;

            return new Shirt(Title, Price, ImageURL, URL);
        } else return {};
    });
}

//Crawl through all individual links listed in Root
const html_to_json = body => {
    const allShirts = [];
    const $ = cheerio.load(body);

    $('.products').find('a').each((index, val) => {
        allShirts.push(link_to_json(rootURL + $(val).attr('href')));
    });

    console.dir(allShirts); // <--- HERE
    return allShirts;
}
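One common pattern (a sketch, not the asker's final code) is to have link_to_json return a Promise that resolves inside the request callback, and let html_to_json wait on all of them with Promise.all. Note that html_to_json then returns a Promise, so its caller has to use .then() as well:

const link_to_json = (link) => new Promise((resolve) => {
    request(link, (err, res, body) => {
        if (error_handler(err, res, link)) return resolve({});
        const $ = cheerio.load(body);
        const shirt_detail = $('.shirt-details').find('h1').text();
        resolve(new Shirt(
            shirt_detail.substr(shirt_detail.indexOf(' ') + 1), // Title
            shirt_detail.substr(0, shirt_detail.indexOf(' ')),  // Price
            $('.shirt-picture').find('img').attr('src'),        // ImageURL
            link));                                             // URL
    });
});

const html_to_json = body => {
    const $ = cheerio.load(body);
    const promises = $('.products').find('a').map((index, val) =>
        link_to_json(rootURL + $(val).attr('href'))).get();
    return Promise.all(promises).then(allShirts => {
        console.dir(allShirts); // now a filled-up array
        return allShirts;
    });
};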



via Ja. L

Having an issue with nodemailer and postfix

I'm receiving this error when I try to run nodemailer

{ Error: 139776600639296:error:140770FC:SSL routines:SSL23_GET_SERVER_HELLO:unknown protocol:../deps/openssl/openssl/ssl/s23_clnt.c:794:
code: 'ECONNECTION', command: 'CONN' }

as shown in this snippet

'use strict';
const nodemailer = require('nodemailer');

// create reusable transporter object using the default SMTP transport
let transporter = nodemailer.createTransport({
    host: 'smtp.localhost',
    port: 465,
    secure: true, // secure:true for port 465, secure:false for port 587
});

// setup email data with unicode symbols
let mailOptions = {
    from: '"Fred Foo 👻" <foo@blurdybloop.com>', // sender address
    to: 'bar@blurdybloop.com, baz@blurdybloop.com', // list of receivers
    subject: 'Hello ✔', // Subject line
    text: 'Hello world ?', // plain text body
    html: '<b>Hello world ?</b>' // html body
};

// send mail with defined transport object
transporter.sendMail(mailOptions, (error, info) => {
    if (error) {
        return console.log(error);
    }
    console.log('Message %s sent: %s', info.messageId, info.response);
});

I cannot figure out where I'm going wrong with this. I left off the auth info on purpose, as it said it would assume it had been authenticated already. Any help would be greatly appreciated - I've been scratching my head at this for days. Thanks!
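For what it's worth, that OpenSSL "unknown protocol" error usually means the client tried to start TLS on a port that isn't actually speaking TLS. A hedged sketch of the two common configurations; which one applies depends on how the local postfix is set up:

// implicit TLS on the SMTPS port (requires postfix to actually serve TLS on 465)
let secureTransporter = nodemailer.createTransport({
    host: 'localhost',
    port: 465,
    secure: true
});

// ...or a plain connection upgraded with STARTTLS on the submission port
let starttlsTransporter = nodemailer.createTransport({
    host: 'localhost',
    port: 587,
    secure: false   // STARTTLS is attempted automatically if the server offers it
});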



via Zach Hill

Automatically Emit watchPosition() with socket.io without JS Interval

I'm looking for a way to listen for when watchPosition() gets a new GPS coordinate and then emit it automatically, without using a JavaScript interval that blindly keeps emitting the values of watchPosition() whether or not the device's location has changed.

Basically, update the latitude and longitude only when the device's location changes.
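For reference, a minimal client-side sketch: watchPosition() already invokes its success callback only when the position changes (plus once initially), so emitting from inside that callback avoids any interval. Here socket is assumed to be an already-connected socket.io client:

const watchId = navigator.geolocation.watchPosition(function (pos) {
    socket.emit('position', {
        lat: pos.coords.latitude,
        lng: pos.coords.longitude
    });
}, function (err) {
    console.error('geolocation error', err);
}, { enableHighAccuracy: true });

// navigator.geolocation.clearWatch(watchId) stops the updates when no longer needed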

Thanks in advance!



via William Ebuka Okafor

node.js not printing to console

I am trying to use sockets in my API. However, I cannot see the logs in the console.

The only output I see is:

[nodemon] restarting due to changes...
[nodemon] starting `node server.js`
We are live on8000

Here is my server.js file:

// server.js
const express        = require('express');
const MongoClient    = require('mongodb').MongoClient;
const bodyParser     = require('body-parser');
const app            = express();
const port           = 8000;

var server = require('http').createServer(app);
var io = require('socket.io')(server);

const db = require('./config/db');

app.use(express.static(__dirname + '/../app'));
app.use(bodyParser.urlencoded({ extended: true }));


MongoClient.connect(db.url,(err,database) =>{

    if (err) return console.log(err);

    //check below line changed
     require('./app/routes')(app, database);
    app.listen(port,() => {
        console.log("We are live on"+port);
    });

    app.get('/', function(req, res){
      res.sendFile(__dirname + '/index.html');
    });


    io.on('connection',function(socket){
      console.log('client connected');

      socket.on('disconnect', function () {
        console.log('disconnect');
      });

    });

})

and the index.html is:

<!DOCTYPE html>
<html>
  <head><title>Hello world</title></head>
  <script src="/socket.io/socket.io.js"></script>
<script>
  var socket = io();
</script>
  <body>Hello world</body>
</html>

I see Hello world in the web browser, but I cannot see 'client connected' in the console log.
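One likely culprit (hedged, based only on the code above): io is attached to the server created with http.createServer(app), but app.listen(port) internally creates and listens on a different server, so the one socket.io is bound to never listens. A sketch of the change:

// listen on the server that socket.io is attached to, instead of app.listen(...)
server.listen(port, () => {
    console.log("We are live on " + port);
});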



via Teja Nandamuri

keep string between two words node js express

I have a string like:

var str = saGDJGJDJGlgk abc_start asjdgjgjgdfjakgfja abc_end csjhgfhsgfgfugvgjdj abc_start djkhfjwhjfgwjgkfvvg abc_end.

I'm displaying this string in my browser using:

res.send('/page/' + result);

I want to filter result such that only the content which starts at abc_start and ends at abc_end remains. How do I do that in Node.js?

For example, the desired output is: abc_start asjdgjgjgdfjakgfja abc_end abc_start djkhfjwhjfgwjgkfvvg abc_end

I tried using: str.split('abc_start').pop().split('abc_end').shift();

But I'm not getting the desired output. Please help.
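One possible approach (a sketch) is a non-greedy global regex instead of split/pop, so every abc_start ... abc_end block is kept:

var matches = str.match(/abc_start[\s\S]*?abc_end/g) || [];
var result = matches.join(' ');
// -> "abc_start asjdgjgjgdfjakgfja abc_end abc_start djkhfjwhjfgwjgkfvvg abc_end"
res.send('/page/' + result);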



via user8151624

How do you pass a filename to the next action in a Gulp pipeline using gulp-tap?

I have a Gulp task which takes an HTML file and inlines styles taken from a CSS file using gulp-inline-css. The original version of my task used the same CSS file for each HTML file. Now I would like to have the task choose a CSS file based on the filename of the HTML file it is processing.

I am using gulp-tap to get the filename. The inliner() function takes the path to the CSS file and runs all the inlining stuff.

The following Gulp task runs inliner() for each of the files, but it seems to be failing to inject the results back into the stream. I've tried a few different approaches, but I can't seem to get the results of inliner() back into the original stream.

gulp.task('inline', inline);

function inline() {
  return gulp.src('dist/**/*.html')
    .pipe(tap( (file, t) => {
      let fileName = path.basename(file.path);
      let cssPath = getStylesheetPathFromHtmlPath(fileName);
      return inliner(cssPath);
    }))
    .pipe(gulp.dest('dist'));
}

function inliner(cssPath) {
  var css = fs.readFileSync(cssPath).toString();
  var mqCss = siphon(css);
  var pipe = lazypipe()
    .pipe(inlineCss, {
      applyStyleTags: false,
      removeStyleTags: true,
      preserveMediaQueries: true,
      removeLinkTags: false
    })
    .pipe(replace, '<!-- <style> -->', `<style>${mqCss}</style>`)
    .pipe(replace, `<link rel="stylesheet" type="text/css" href="css/${getStylesheetNamespace(cssPath)}.css">`, '')
    .pipe($.htmlmin, {
      collapseWhitespace: true,
      minifyCSS: true
    });
  console.log(cssPath)
  return pipe();
}

Am I using gulp-tap incorrectly? This seems like a very simple use case.
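One thing to check (hedged, from memory of the gulp-tap README): returning a lazypipe from the tap callback doesn't route the file through it; gulp-tap expects the file to be pushed through a stream via t.through(streamFactory, args). A sketch of that shape, reusing the existing inliner():

function inline() {
  return gulp.src('dist/**/*.html')
    .pipe(tap((file, t) => {
      let cssPath = getStylesheetPathFromHtmlPath(path.basename(file.path));
      // pipes this one file through the stream returned by inliner(cssPath)
      return t.through(inliner, [cssPath]);
    }))
    .pipe(gulp.dest('dist'));
}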



via oatmealsnap

Ajax header causing 400 response when tested on node/Heroku, but works on localhost

This all works fine offline, but when I upload it to Heroku I keep getting 400 errors, and it never reaches any of the routes. It does get to the routes when I comment out the header. What's going on here? Any guesses?

I'm basically trying to send a JWT token when the page is loaded to see if the user needs to login again, or if I can just retrieve their info. Works fine tested on localhost with node.

        $.ajax({
            url: 'users/preauth',
            type: 'POST', 
            dataType: 'json',
            // contentType: "application/json ;charset=UTF-8",
            headers: {"Authorization": TOKEN},
        }).done( function(result){ 
        }).fail(function(err){ 
        })

Is it a CORS issue? And if it is, how do I set it up with my Express app to allow the authorization header?

Is there a security risk intrinsic to allowing headers, and if so, is there a better way to send the token without an authorization header?
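If it is CORS, the browser's preflight (OPTIONS) request has to be told that the Authorization header is allowed, which Postman and same-origin localhost testing never trigger. A hedged Express sketch (the origin is illustrative):

app.use(function (req, res, next) {
    res.header('Access-Control-Allow-Origin', 'https://your-frontend.example'); // illustrative origin
    res.header('Access-Control-Allow-Headers', 'Authorization, Content-Type');
    res.header('Access-Control-Allow-Methods', 'GET, POST, OPTIONS');
    if (req.method === 'OPTIONS') return res.sendStatus(200);
    next();
});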



via Luddens Desir

Creating Javascript Regex from C

Is it possible to create a RegExp javascript object, using C/C++? I'm just curious about it.

Example:

const myCLib = require('myCLib');
myCLib("/\\s/", "g"); => return regex object /\g/g

Thank you.



via Celo

How to access results of shopify-node-api request outside callback function

I'm building a Shopify app and I'm running into an issue while using the shopify-node-api module. This is the code I'm working with:

collectProducts: ['storedProducts', function(results, callback) {

  const collected_products = results.storedProducts;

  for (var i = 0; i < collected_products.length; i++) {

    Shopify.post('/admin/collects.json', {
      "collect": {
        "product_id": collected_products[i].product_id,
        "collection_id": process.env.DAILY_COLLECTION
      }
    }, function(err, data, headers){
      collected_products[i].collect_id = data.collect.id;
    });
  }

  callback(null, collected_products);
}],

For clarity's sake, the collectProducts item is part of an async function. I'm trying to gather the collect ID from the response to the post request and update the collect_id value in collected_products. The issue is I can't seem to access the collected_products array from inside the callback function for the post request. Is there a way to 1. simply return that value for each iteration of the for loop or 2. access the collected_products array from within that callback function to store those values?
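One possible restructuring (a sketch using the async library's each, which the surrounding async flow suggests is already available): the inner callbacks run later than the outer callback(null, ...), and var i has moved past the end of the array by then, so both issues go away if each post gets its own iteration and the final callback only fires after all of them complete:

collectProducts: ['storedProducts', function(results, callback) {

  const collected_products = results.storedProducts;

  // assumes the async package: var async = require('async');
  async.each(collected_products, function(product, done) {
    Shopify.post('/admin/collects.json', {
      "collect": {
        "product_id": product.product_id,
        "collection_id": process.env.DAILY_COLLECTION
      }
    }, function(err, data, headers) {
      if (err) return done(err);
      product.collect_id = data.collect.id;   // product is the right element, no loop index needed
      done();
    });
  }, function(err) {
    callback(err, collected_products);        // runs only after every post has finished
  });
}],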

Thanks in advance for any answers!



via paperbeatsscissors

Run a user-defined function only if at least one user is connected through sockets - Node.js

I am trying to find a way to stop the setInterval(test, 5000) function from running when no user is connected to the socket, as it wastes a lot of resources.

I found the property, but I don't know where to put it:

io.engine.clientsCount  // this tells the number of users connected, but only inside the socket.on function

below is my code:

var connectCounter = 0;  


app.get('/', function(req, res){
  res.sendFile(__dirname + '/index.html');
});


 function test()
  {
    httpk.get("api-url", function(res) {
        var body = ''; 
        res.on('data', function(data){
            body += data;
        });

        res.on('end', function() {
            var parsed = JSON.parse(body);
            console.log(parsed.johndoe.example1);
            nsp.emit('live-quote', parseFloat(parsed.johndoe.example1));
        });
    });
  }

     setInterval(test,5000);

nsp.on('connection', function(socket){


  //Make a http call
  connectCounter++;
  nsp.emit('live-users',connectCounter);
  console.log('1 user connected, Total Joined: '+connectCounter);

  socket.on('disconnect', function(){
    connectCounter--;
    nsp.emit('live-users',connectCounter);
    console.log('1 user disconnected, Total Left: '+connectCounter);


  });

console.log("total clients: "+io.engine.clientsCount);

if(io.engine.clientsCount >= 1)
{
  //do something
  //if I put setInterval here it will cause problems, that is for each connection it will run setInterval causing lot of http get request
  // meaning, if 100 users then 100 get request in 5 seconds (depending on setInterval time).
}

});

How do I best stop execution of setInterval(test, 5000) if no users are connected?
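One possible sketch: keep a handle to the interval, start it on the first connection and clear it when the last user disconnects, so test() only runs while someone is listening:

var intervalId = null;

nsp.on('connection', function (socket) {
  connectCounter++;
  if (intervalId === null) {
    intervalId = setInterval(test, 5000);   // start polling only when someone is connected
  }

  socket.on('disconnect', function () {
    connectCounter--;
    if (connectCounter === 0 && intervalId !== null) {
      clearInterval(intervalId);            // stop polling when the last user leaves
      intervalId = null;
    }
  });
});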



via Murlidhar Fichadia

Piping HTTP 'CONNECT' requests to another net.Socket and getting the response back

I'm trying to set up a reverse proxy server that takes HTTP(S) requests and relays the requests to an exit node. The exit node will give a response back to the master server and the master proxy should return the result to the client that initiated the request.

Like this:

[CLIENT] ---HTTP(S) REQ---> [MASTER PROXY] --- RELAY REQ ---> [EXIT NODE]
   ^                                                            |
    \--- RELAY RESP --- [MASTER PROXY] <--- HTTP(S) RESP ------/

Insecure HTTP is easy: I'm using net.Socket.write for communication between the master proxy and the exit node.

Secure HTTPS is difficult. I don't want to throw cert errors. For a standard reverse proxy you'd pipe the TCP connection like this:

var http = require('http'),
    net = require('net'),
    httpProxy = require('http-proxy'),
    url = require('url'),
    util = require('util');

var proxy = httpProxy.createServer();

var server = http.createServer(function (req, res) {
  util.puts('Receiving reverse proxy request for:' + req.url);

  proxy.web(req, res, {target: req.url, secure: false});
}).listen(8213);

server.on('connect', function (req, socket) {
  util.puts('Receiving reverse proxy request for:' + req.url);

  var serverUrl = url.parse('https://' + req.url);

  var srvSocket = net.connect(serverUrl.port, serverUrl.hostname, function() {
    socket.write('HTTP/1.1 200 Connection Established\r\n' +
    'Proxy-agent: Node-Proxy\r\n' +
    '\r\n');
    srvSocket.pipe(socket); // pipe the connection to allow handshake directly between server/client
    socket.pipe(srvSocket); // pipe the connection to allow handshake directly between server/client
  });
});

But I can't figure out how to pipe to the exit node and get a valid response in plaintext all the way back to the requesting client. Does anyone have any suggestions on how to achieve this?



via rdbell

What folder structure should I use?

I want to build a Twitch chat bot with tmi.js, but I wonder what folder structure I should use.
And what are the conventions for this kind of app?



via Olivier.B

Generate JSON type definition file with Tern.js's condense utility

I'm trying to use Tern's condense script to generate a type definition file for the p5.js Javascript library.

The Tern documentation says: "Pass --plugin name or --plugin name={jsonconfig} to load plugins. Use --def file to load JSON definitions."

I put the p5.js source code in the project's root and ran node condense --plugin ../p5/p5.js

I'm getting reference errors for all window, screen, and document-related functions, which makes me think the command is trying to run p5.js files, not analyze them.

Am I structuring the command incorrectly? Do I need a .tern-project file for this?



via Jen Kagan

Error: listen EADDRINUSE :::443 error when trying to start nodejs server

I am trying to create a nodejs application that uses SSL. I have the cert, domain, and nodejs configured and working, but I'm having a problem sometimes when I stop my node server using Ctrl+C. I am getting the following error when trying to restart the node server:

Error: listen EADDRINUSE :::443
    at Object.exports._errnoException (util.js:870:11)
    at exports._exceptionWithHostPort (util.js:893:20)
    at Server._listen2 (net.js:1237:14)
    at listen (net.js:1273:10)
    at Server.listen (net.js:1369:5)
    at Object.<anonymous> (/var/www/html/server.js:212:47)
    at Module._compile (module.js:410:26)
    at Object.Module._extensions..js (module.js:417:10)
    at Module.load (module.js:344:32)
    at Function.Module._load (module.js:301:12)

Now I know this means that the port is in use, but I've tried several things to find the process that is using that port and I can't. A couple of things I've tried that are worth mentioning are netstat:

$ sudo netstat -tulpn
Active Internet connections (only servers)
Proto Recv-Q Send-Q Local Address           Foreign Address         State       PID/Program name
tcp        0      0 0.0.0.0:22              0.0.0.0:*               LISTEN      1164/sshd       
tcp6       0      0 :::22                   :::*                    LISTEN      1164/sshd       
udp        0      0 0.0.0.0:68              0.0.0.0:*                           882/dhclient

and ps:

$ ps aux |grep node
ubuntu    7632  0.0  0.0  12944   988 pts/0    S+   23:30   0:00 grep --color=auto node

but neither of them show me what is using port 443. I have pm2 to keep my node server in a running state, but it keeps saying errored and then I find the EADDRINUSE errors in the pm2 error log. I've been banging my head against the wall for hours. Can anyone assist me? Thanks!
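A couple of other things that might reveal the holder of the port (netstat sometimes hides processes owned by other users unless run as root, and pm2's daemon keeps restarting the app, which can itself produce the EADDRINUSE loop):

sudo lsof -i :443          # list any process bound to 443
sudo ss -tlnp | grep 443   # same idea via ss
pm2 list                   # check whether pm2 already has an instance of the app running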



via Alan

Nightmare js .then() error

I am trying to work out an error in my code but can't seem to figure it out. I believe the error is with the .then() statement, and it then throws the error that wait is not a function. I am trying to check if the name "commit" exists on the page. Thanks

nightmare
 .goto('http://www.google.com/')
 .evaluate(function () {
      return document.querySelectorAll('[data-style-name="' + "Black" + '"]')[0].click();
 })
 .wait(100)
 .exists('[name="commit"]')
 .then(function(result){
      if(result){
           console.log("This Exists");
      }
 })
 .wait(100)
 .catch(function (error) {
      console.error('Search failed:', error);
 });

The error it throws out is:

TypeError: nightmare.goto(...).evaluate(...).wait(...).exists(...).then(...).wait is not a function
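For reference, a hedged sketch of why this happens: calling .then() ends the Nightmare chain and returns a plain Promise, so .wait() is no longer available on the result. Keeping all Nightmare actions before the single .then() (or continuing the work inside it) avoids the error:

nightmare
  .goto('http://www.google.com/')
  .evaluate(function () {
    return document.querySelectorAll('[data-style-name="' + "Black" + '"]')[0].click();
  })
  .wait(100)
  .exists('[name="commit"]')
  .then(function (result) {        // after .then() there is only a Promise, not Nightmare
    if (result) {
      console.log("This Exists");
    }
  })
  .catch(function (error) {
    console.error('Search failed:', error);
  });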



via Noah Cover

how to handle NodeJS query response using javascript

router.post('/queryrule', function(req, res){
    var requestID = req.body;

    var frm_requestID = requestID['sbruleid'];
    req.checkParams('frm_requestID', 'Not valid Rule ID!').isInt();

    var errors = req.validationErrors();
    if (errors) {
        res.render('queryrule', {
            errors: errors
        });
    } else {
        var query = { sbruleid: frm_requestID };
        Ids.searchids(query, function (err, id) {
            if (err) throw err;
            console.log("out: " + id); //-> the id json query is correct
            res.send(ids); //-> this is supposed to send data to my javascript
        });
    }
});

Hi, thanks for reading and answering. I do not receive any data after my Node.js res.send sends data to my JavaScript. The Node.js query code above works fine up to console.log("out: " + id); below is the JavaScript that posts the data and is supposed to receive the response, but it isn't working.

$(document).ready( function() {
$('#btn_frm_search').click( function() {

    query = { 'success' : false, 'data' : "" };

    query = validateFilter();
    if ( !query.success ) {
        return false;
    }

    quertdata = query.data;
//alert("filter:  "+ quertdata['sbruleid']);
    $.post( '/users/queryidsrule' , quertdata , function(data) {
        console.log(data); // -> this suppose to receive data from nodejs but no data receive.
        createResultTable(data);
    });
});

});
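One small debugging aid (a sketch): $.post only calls the success handler on a successful response, so attaching a .fail handler will surface whether the server is actually erroring. One thing it may reveal: in the route above, res.send(ids) references ids while the callback parameter is named id, which would throw on the server.

$.post('/users/queryidsrule', quertdata, function (data) {
    console.log(data);
    createResultTable(data);
}).fail(function (jqXHR) {
    console.error('request failed', jqXHR.status, jqXHR.responseText);
});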



via mr yhel

Node.js & Electron: Where to require several modules?

Requiring 5-10 modules in a render process window takes about 5 seconds to render that window, although the window is already visible. I understand that require is also synchronous.

Therefore, is there a way to:

  1. make this asynchronous, or
  2. require these modules globally for current and new windows, or
  3. call these via ipc from main.js if this is faster?, or
  4. make this faster more practically?


via user1679669

express session and passport: req.isAuthenticated() return false after login

I need to handle persistent sessions in an Angular app using Express and Passport on the backend. After a successful login, if I make an HTTP call (using Angular $http) to an Express API which returns request.isAuthenticated(), it always returns false. This is not the case when I log in and make the HTTP call to the API using Postman; in that case I get true.

This is my configuration on the server:

server.js

const
        express = require('express'),
        config = require("../config"),
        path = require('path'),
        bodyParser = require('body-parser'),
        cookiePraser = require('cookie-parser'),
        cors = require('cors'),
        winston = require("winston"),
        morgan = require("morgan"),
        mongoose = require("mongoose"),
        passport = require("passport"),
        session = require("express-session"),
        flash = require("connect-flash");



let app = express(),
    server = require("http").Server(app),
    io = require("socket.io")(server);

const sessionKey = "mySessionKey";




/*
 * ---------------------------------------------------------------------------------------
 * app configuration
 * ---------------------------------------------------------------------------------------
 */

// Add headers
app.use(function (req, res, next) {

    // Website you wish to allow to connect
    res.setHeader('Access-Control-Allow-Origin', req.headers.origin);

    // Request methods you wish to allow
    res.setHeader('Access-Control-Allow-Methods', 'GET, POST, OPTIONS, PUT, PATCH, DELETE');

    // Request headers you wish to allow
    res.setHeader('Access-Control-Allow-Headers', "Content-Type,X-CSRF-Token, X-Requested-With, Accept, Accept-Version, Content-Length, Content-MD5,  Date, X-Api-Version, X-File-Name");

    // Set to true if you need the website to include cookies in the requests sent
    // to the API (e.g. in case you use sessions)
    res.setHeader('Access-Control-Allow-Credentials', true);

    // Pass to next layer of middleware
    next();
});



app.use(morgan("dev")); 
app.use(bodyParser.json({limit: "50mb"}));
app.use(cookiePraser(sessionKey));    
app.use(express.static("public"));



app.use(session({
    secret: sessionKey,
    resave: true,
    saveUninitialized: true,
    cookie: {
        secure: false,
        httpOnly: false
    }
}));
app.use(passport.initialize());
app.use(passport.session()); 

require("./passportConfig")(passport); // passport configuration


app.get("api/test", function(req, res){
    return json({isAuthenticated: req.isAuthenticated()});
})

// [..]

passportConfig.js

const   LocalStrategy   =   require("passport-local").Strategy,
        User            =   require("./models/User");



module.exports = function(passport) {


    // used to serialize the user for the session
    passport.serializeUser(function(user, done) {
        done(null, user.id);
    });

    // used to deserialize the user
    passport.deserializeUser(function(id, done) {
        User.findById(id, function(err, user) {
            done(err, user);
        });
    });





    passport.use('local-signup', new LocalStrategy({

            usernameField : 'email',
            passwordField : 'password',
            passReqToCallback : true // allows us to pass back the entire request to the callback
        },

        function(req, email, password, done) {

            // asynchronous
            // User.findOne wont fire unless data is sent back
            process.nextTick(function () {

                User.findOne({'local.email': email}, function (err, user) {
                    if (err)
                        return done(err);

                    // check to see if theres already a user with that email
                    if (user) {
                        return done(null, false, req.flash('signupMessage', 'That email is already taken.'));
                    } else {

                        // if there is no user with that email
                        // create the user
                        let newUser = new User();

                        // set the user's local credentials
                        newUser.local.email = email;
                        newUser.local.password = newUser.generateHash(password);

                        // save the user
                        newUser.save(function (err) {
                            if (err)
                                throw err;
                            return done(null, newUser);
                        });
                    }

                });

            });

        }
    ));







    passport.use('local-login', new LocalStrategy({
            usernameField : 'email',
            passwordField : 'password',
            passReqToCallback : true // allows us to pass back the entire request to the callback
        },
        function(req, email, password, done) { // callback with email and password from our form

            // find a user whose email is the same as the forms email
            // we are checking to see if the user trying to login already exists
            User.findOne({ 'local.email' :  email }, function(err, user) {
                // if there are any errors, return the error before anything else
                if (err)
                    return done(err);

                // if no user is found, return the message
                if (!user)
                    return done(null, false, req.flash('loginMessage', 'No user found.')); // req.flash is the way to set flashdata using connect-flash

                // if the user is found but the password is wrong
                if (!user.validPassword(password))
                    return done(null, false, req.flash('loginMessage', 'Oops! Wrong password.')); // create the loginMessage and save it to session as flashdata

                // all is well, return successful user
                return done(null, user);
            });

        }
    ));





};

After a login I can see that the connect.sid cookie has been set, but then if I try to call the "api/test" route using Angular $http I always get false (it returns true if I use Postman). Any suggestion on how to fix this?
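One common cause (hedged): unlike Postman, a cross-origin $http call does not send the connect.sid cookie unless withCredentials is set, so Passport never sees the session. The server above already sends Access-Control-Allow-Credentials; a sketch of the client side:

// per request
$http.get('http://api.example.com/api/test', { withCredentials: true })
    .then(function (response) {
        console.log(response.data.isAuthenticated);
    });

// ...or globally in the Angular app config
$httpProvider.defaults.withCredentials = true;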



via revy

Failed to connect to mLab by mongo

I tried to connect to mLab from my terminal, following the instructions on the website. This is the command that I typed:

mongo ds151461.mlab.com:51461/simplelogin -u <dbuser> -p <dbpassword>

This is the result I got:

MongoDB shell version v3.4.4
connecting to: mongodb://ds151461.mlab.com:51461/simplelogin
MongoDB server version: 3.2.13
WARNING: shell and server versions do not match
2017-06-12T19:12:35.498-0400 E QUERY    [thread1] Error: Authentication failed. :
DB.prototype._authOrThrow@src/mongo/shell/db.js:1459:20
@(auth):6:1
@(auth):1:2
exception: login failed

By the way, when I run local MongoDB I use mongod to make it work; mongo alone does not work.



via Zuoyang Ding

Protractor: text.indexOf(...).isDisplayed is not a function

I'm aware this has been asked on a few occasions and even though I've looked through those questions, I'm not really sure how to fix this.

I'm checking if the text "EUR" is contained in a div called "currency". This was working for me previously, but I've started using lint and I've been getting a lot of these kinds of errors.

This is the error I'm getting Failed: text.indexOf(...).isDisplayed is not a function

This is my code

checkBuyerCurrency (text, buyerCurrency) {
    let currencyPromise = new Promise(function (resolve, reject) {
    const commonUtils = new CommonUtils();
    var EC = protractor.ExpectedConditions;
    browser.wait(EC.visibilityOf(element(by.className("currency")),     4000));
    var checkCurrency = element(by.className("balances"));
      checkCurrency.getText().then(function (text) {
           expect (text.indexOf("EUR").isDisplayed()).toBe(true);
           console.log("EUR only buyer");
      });
    });
  }

Do I need to make text a variable or convert it to a string? I'm not entirely sure how to do this because of the way I'm using the expect statement.
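A hedged sketch of the likely fix: indexOf returns a number, and numbers have no isDisplayed method, so the assertion should stay on the string itself:

checkCurrency.getText().then(function (text) {
    expect(text.indexOf("EUR") !== -1).toBe(true);
    // or, using Jasmine's matcher:
    expect(text).toContain("EUR");
});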

Thanks for any help



via Edmond

HTML form post to node.js file keeps loading

I'm trying to do a simple form submission on my server that posts some contact information. I receive the information successfully in my Node.js file that runs the server, but after I click submit the page tries to load the action page and eventually fails.

HTML

<form id = "form" action="/", method = "post">
....
<button type="submit" form="form" value="Submit">Submit</button>
</form>

Node.js

app.post('/', function(req, res) {
.......

I'm using Express to serve static HTML and image files, and I'm connecting to localhost:8080/MyWebsite.html. But after I click the submit button I'm redirected unsuccessfully to localhost:8080, where Chrome says "this site cannot be reached". Can someone explain to me exactly what's going on here, and how I can simply submit the form and stay on the page without any other issues?
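One thing worth checking (a hedged sketch): the browser keeps "loading" until the POST handler ends the response, so the route needs to send something back, for example:

app.post('/', function (req, res) {
    // ... handle the contact information ...
    res.redirect('/MyWebsite.html');   // or res.send('ok') / res.sendStatus(200)
});

Staying on the page without any navigation at all would instead require submitting the form via AJAX rather than a normal form post.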



via Sherman Luo

amqp.node library, viewing error

I'm trying to use the amqp.node library to connect to RabbitMQ over SSL, and according to the docs (http://www.squaremobius.net/amqp.node/ssl.html) you should pass console.warn as the rejection handler of the then callback. In the project I just started working on (my first Node project), we are using the winston logger. So how do I actually see the error? When I do:

var opts = { }; // my ssl info
amqplib.connect("ampws://{user}:{pass}@{host}:{port}", opts).then(function(err, conn) {
    if (err) {
        winstonLogger.error("err: " + err) // this just prints [object Object]
    }
}).then(null, console.warn);

I'm not sure how to map the console.warn to my actual logger.
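A hedged sketch: connect() returns a promise, so the first .then receives the connection itself (not an err, conn pair), and errors go to the rejection handler, which is where winston can replace console.warn. Logging err.message / err.stack (or passing the error as metadata) avoids the [object Object] output:

// note: the URI scheme for TLS connections is amqps://
amqplib.connect("amqps://{user}:{pass}@{host}:{port}", opts)
  .then(function (conn) {
    // connection succeeded
  })
  .catch(function (err) {
    winstonLogger.error('amqp connection failed: ' + err.message, { stack: err.stack });
  });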



via Crystal

MongoDB: mongod shows that my app is not authorized

I have installed MongoDB; my MongoDB and db folders are C:/mongoDB/bin and C:/data/db respectively. I have also set up an admin user as described at https://docs.mongodb.com/manual/tutorial/enable-authentication/

Now I want to perform basic CRUD operations, requiring both read and write, on a database mApp through Express and Mongoose. I am providing code for both the app and the schema below. The code is documented so that it is easy to understand.

App.js

var express = require('express');
var app = express();

//Invoking user
var User = require('./schema.js');

//Creating an employee object by giving values to all properties
var User1 = new User({
  name: 'Anurag',
  username: 'Anurag2',
  password: 'abc',
  admin: false,
  location: 'somewhere',
  meta: {
    age: 25,
    website: 'abc.com'
  },
  createdAt: 'Jun 11 2017',
  updatedAt: 'Jun 11 2017'
}); //Remember to provide all records,otherwise document wont be saved.

//CRUD start. Creating a user document
User1.save(function(err, employ, num) {
  if (err) {
    console.log('error occured');
  }
  console.log('saved ' + num + ' record');
  console.log('Details ' + employ);
});

/* To retrieve documents from the database, you can retrieve all at
once, or one at a time with find(), findById(), findOne() */

//To retrieve all documents
User.find({}, function(err, data) {
  if (err) {
    console.log('error occured while retrieving all docs');
  }
  console.log(data);
});

User.findOne({
  username: 'Anurag2'
}, function(err, data) {
  if (err) {
    console.log('error in finding one document');
  }
  console.log(data);
});

User.update({
  location: 'someplace'
}, {
  location: 'anything'
}, function(err) {
  if (err) {
    console.log('error in updating');
  }
  console.log('updated');
});

//update one document
User.findOneAndUpdate({
  username: 'Anurag2'
}, {
  admin: true
}, function(err, data) {
  if (err) {
    console.log('error in finding and updating');
  }
  console.log('updated' + data);
});

//Delete a user document
User.remove({
  location: 'anywhere'
}, function(err) {
  if (err) {
    console.log('error occured');
  }
  console.log('removed');
});

DB Schema(schema.js)

var mongoose = require('mongoose');
mongoose.connect('mongodb://localhost/mApp'); //mApp is the database being connected to here.


//now we open a connection
var db = mongoose.connection;
db.once('open', function() {
  console.log('Connected to Database');
});
db.on('error', console.error.bind(console, 'connection error'));

//initialising a schema
var Schema = mongoose.Schema;

mongoose.Promise = require('bluebird'); //used as mpromise was showing deprecated on console.


//creating a schema
var userSchema = new Schema({
  name: String,
  username: {
    type: String,
    required: true,
    unique: true
  },
  password: {
    type: String,
    Required: true
  },
  admin: Boolean,
  location: String,
  meta: {
    age: Number,
    website: String
  },
  createdAt: Date,
  updatedAt: Date
});

//creating a model that uses this schema
var User = mongoose.model('User', userSchema);

//now we export this model
module.exports = User;

Now I log in to mongo as admin and change the db to mApp. I run the app through node. The mongod console shows I am not authorized to perform any actions on the database.

No query gets executed and I receive error messages for all of them. Why is this happening? Please help me with this.
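A hedged sketch of the usual fix once authorization is enabled: create a user on the mApp database with readWrite rights and put those credentials in the connection string, since the current mongoose.connect call passes none (user name and password below are illustrative):

// in the mongo shell, authenticated as the admin user:
//   use mApp
//   db.createUser({ user: "mAppUser", pwd: "somePassword", roles: [{ role: "readWrite", db: "mApp" }] })

// schema.js
mongoose.connect('mongodb://mAppUser:somePassword@localhost:27017/mApp');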



via Anurag Sharma

How do I create a clickable link in common terminal emulators?

At my last workplace the senior dev had put together a big Nodejs build system with Grunt and Bower and all the buzzwords of the time. When running grunt-serve or grunt-watch, it would print a clickable link in the terminal emulator. This link worked in both Ubuntu 10.10(?) and Arch, but no other links I saw in the terminal were clickable, leading me to believe it was some Node sorcery.

How could this have been achieved?



via Strelok

Mocha cannot find file when test is run

I'm trying to test my Node endpoint that makes a call to another module. The utils module tries to read from a file. It works fine when I just call the endpoint, but when the test runner tries to open it, it cannot find the JSON file.

app.js

const express = require('express')
const app = express()
const utils = require('./utils.js')

app.post('/', function (req, res) {      
  var state = utils.determineState();
  res.send(state);
})

utils.js

var fs = require('fs');

var determineState = function(long, lat){
    var lines = loadStatesFile();

    return lines[0];
};

function loadStatesFile(){
    var statesMap = {};
    var lines = require('fs').readFileSync("../../states.json", 'utf-8').split('\n');

    return lines;
};

module.exports = {
    determineState: determineState
};

appTest.js

describe('POST / ', () => {
        it('should response with 200', (done) => {
            chai.request(app).post('/')
            .end((err, res) => {
                expect(res.statusCode).to.equal(200);
                done();
            });
        });
    });

Whenever I run this test I'm getting an error on the readFileSync line: Error: ENOENT: no such file or directory, open '../../states.json'

Any ideas on how I can have the test find and open that file? It works fine when I run the app normally.
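One likely explanation (hedged): readFileSync resolves relative paths against the process working directory, which is different when mocha launches the tests. Anchoring the path to the module's own directory sidesteps that; the relative segment below is illustrative and should point at wherever states.json actually lives relative to utils.js:

var path = require('path');

function loadStatesFile() {
    // resolve relative to utils.js instead of the current working directory
    var statesPath = path.join(__dirname, '../../states.json');
    return fs.readFileSync(statesPath, 'utf-8').split('\n');
}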



via Kierchon

aws-sdk signed request response throwing syntax error

I'm trying to integrate direct uploads for images to S3 in my Heroku deployed application and following this guide by Heroku — Direct to S3 File Uploads in Node.js.

I've followed all the instructions in the guide correctly. On uploading an image in my form, I even get a 200 response with a responseText which is supposed to be parsed to JSON. The problem is that I get a syntax error when I open Chrome's developer tools and it also pauses the execution in the debugger.

Uncaught SyntaxError: Unexpected end of JSON input
    at JSON.parse (<anonymous>)
    at XMLHttpRequest.xhr.onreadystatechange (photo-upload.js:19)

The following is logged in my console which looks like valid JSON and should be parsed without any issue (dummy value for access key and signature):

{"signedRequest":"https://getfitapp.s3.amazonaws.com/my-pic.jpg?AWSAccessKeyId=DHQTJ126JKQ9DZLODQ9C&Content-Type=image%2Fjpeg&Expires=1497301502&Signature=AjwDHv2PvLUItEgGWtF2G5In2l0%3D&x-amz-acl=public-read","url":"https://getfitapp.s3.amazonaws.com/my-pic.jpg"}

Also, for reference, this is the function used to get a signed request from AWS before uploading the file:

function getSignedRequest(file){
  const xhr = new XMLHttpRequest();
  xhr.open('GET', `/sign-s3?file-name=${file.name}&file-type=${file.type}`);
  xhr.onreadystatechange = () => {
    if(xhr.readyState === 4) {
      if(xhr.status === 200) {
        const response = JSON.parse(xhr.responseText);
        uploadFile(file, response.signedRequest, response.url);
      }
      else {
        alert('Could not get signed URL.');
      }
    }
  };
  xhr.send();
}

What might be the problem? Since the response is coming from AWS, I don't know what I can do to fix the problem.
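One small hedged debugging step: "Unexpected end of JSON input" is what JSON.parse throws on an empty string, so logging the raw responseText for the exact call that fails can confirm whether some request through this handler is returning an empty body:

if (xhr.readyState === 4 && xhr.status === 200) {
    console.log('raw /sign-s3 response:', JSON.stringify(xhr.responseText));
    const response = JSON.parse(xhr.responseText || '{}');  // guard against an empty body
    uploadFile(file, response.signedRequest, response.url);
}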



via Daksh Shah

Node.js - Listening for database changes

I'm developing an Android application that saves information inside a smartphone database and shows closed roads using Google Maps.

Besides that, I created a server that runs Node.js to get the data from a local MySQL database and sends the data using the JSON format.

My main objective now is to listen for Server database changes and if a change was made, send a token to the Android application, requesting to update the smartphone database.

I've tried to use the ZongJi and mysql-events packages, but neither seems to work when I make a change in the server database. Another option I found is to use MySQL triggers, but since I have never worked with them they are still a bit confusing to me.

This is the code I used to test the mysql-events package:

var http = require('http');
var mysql = require('mysql');
var io = require('socket.io');
var MySQLEvents = require('mysql-events');

var connection = mysql.createConnection({
        host : 'localhost',
        user : 'root',
        password : '',
        database : 'closed_roads',
});

var dsn = {
  host: 'localhost',
  user: 'root',
  password: '',
};

var mysqlEventWatcher = MySQLEvents(dsn);

var watcher = mysqlEventWatcher.add(
  'closed_roads.localizacao',
  function (oldRow, newRow, event) {
     //row inserted 
    if (oldRow === null) {
      console.log(event);
    }
 
     //row deleted 
    if (newRow === null) {
       console.log(event);
    }
 
     //row updated 
    if (oldRow !== null && newRow !== null) {
      console.log(event); 
    }
 
  }, 
  'Active'
);

var server = http.createServer(function(request, response){ 
    
        connection.query('SELECT * FROM localizacao', function(err, results, fields) {
        
                console.log('Número de Registos: ' + results.length);
                
                response.writeHead(200, { 'Content-Type': 'application/json'});
                response.end(JSON.stringify(results));
                response.end();

    });
        
}).listen(3000);

var socket = io.listen(server);

socket.on('connection', function(client){ 
    
        console.log('Ligação estabelecida com o cliente!');

    client.on('message', function(event){ 
        console.log('Mensagem recebida do cliente:', event);
    });

    client.on('disconnect',function(){
        console.log('Ligação perdida!');
    });
});

console.log('Servidor correndo em http://127.0.0.1:3000/');

I would appreciate any help to find a solution to this problem.
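One thing that may be worth checking (hedged): ZongJi and mysql-events work by reading MySQL's replication binlog, so they silently see nothing unless the server has row-based binary logging enabled and the connecting account has replication privileges (REPLICATION SLAVE and REPLICATION CLIENT). Illustrative my.cnf settings for the mysqld section:

server-id        = 1
log_bin          = mysql-bin
binlog_format    = ROW
binlog_row_image = FULL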



via Ricardo Faria

module.exports.variable is always an empty object if it's reinitialized

I am a bit confused about the module.exports behaviour in the examples below. In case 1, a.arrayVar in b.js correctly shows all the elements that get added to the array in a.js, even at run time, but the same does not happen in case 2. The only difference between the two cases is that arrayVar is reinitialized inside the function in case 2. I have a use case where the array must be reinitialized every time it is updated dynamically, and I am finding it difficult to implement. Any help in understanding the concept is greatly appreciated.

case 1 :

// a.js
var arrayVar= [];
module.exports.arrayVar= arrayVar;

function test(element){
arrayVar.push(element);
}


// b.js
var a= require('./a.js');
console.log(a.arrayVar);

case 2 :

// a.js
var arrayVar= [];
module.exports.arrayVar= arrayVar;

function test(element){
 arrayVar= [];
 arrayVar.push(element);
}

// b.js
var a= require('./a.js');
console.log(a.arrayVar);
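For reference, a hedged sketch of what is going on and two ways around it: module.exports.arrayVar holds a reference to the original array, and arrayVar = [] rebinds only the local variable, so the export keeps pointing at the old (now never-updated) array. Either mutate the existing array, or reassign the exported property together with the local variable:

// a.js - option 1: empty the same array instead of creating a new one
function test(element) {
  arrayVar.length = 0;      // clears in place, the export still sees it
  arrayVar.push(element);
}

// a.js - option 2: rebind the exported property along with the local variable
function test(element) {
  arrayVar = [element];
  module.exports.arrayVar = arrayVar;
}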



via KBSri

Electron package does not open in Ubuntu

I have made a simple application that gets the filenames from a directory, using the Electron framework with JavaScript, HTML and CSS. I installed Node.js and npm on Windows and created packages for Linux and Windows using electron-packager. The application packaged for Windows runs perfectly fine on Windows, but the package I created for Linux does not run on Linux: I click on the file with the name of the project and nothing happens.

I searched a lot on Google but did not find anything helpful, and I don't know what the issue may be. Is there any program required to run it on Linux?

Please help

following is my package.json file

{
  "name": "test",
  "productName": "test",
  "version": "1.0.0",
  "description": "Software",
  "main": "main.js",
  "devDependencies": {
    "electron": "^1.6.10",
    "electron-packager": "^8.7.1",
    "readdirp": "^2.1.0"
  },
  "scripts": {
    "start": "electron main.js",
    "package-mac": "electron-packager . --overwrite --  platform=darwin --arch=x64 --icon=assets/icons/mac/icon.icns --prune=true --out=release-builds",
    "package-win": "electron-packager . --overwrite --asar=true --platform=win32 --arch=ia32 --icon=assets/icons/win/icon.ico --prune=true --out=release-builds --version-string.CompanyName=CE --version-string.FileDescription=CE --version-string.ProductName=\"Electron Tutorial App\"",
    "package-linux": "electron-packager . --overwrite --platform=linux --arch=x64 --icon=assets/icons/png/1024x1024.png --prune=true --out=release-builds"
  },
  "author": AKB",
  "license": "ISC",
  "dependencies": {
  "readdirp": "^2.1.0"
   }
}
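One way to get more information (a hedged suggestion): launch the packaged binary from a terminal so any startup error is printed, and make sure it is executable. With electron-packager's default output naming this would look roughly like:

cd release-builds/test-linux-x64
chmod +x test
./test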



via Aakash Bhadana

Node.js proxy with nginx

This might be a similar question to this one (Node.JS proxy with Nginx and Docker) but with some small differences. I want to run nginx directly on the host and proxy-pass to a Docker container running Node.js.

This is the configuration block.

    location /nodejs {
            proxy_set_header X-Real-IP $remote_addr;
            proxy_set_header X-Forwarded-For $proxy_add_x_forwarded_for;
            proxy_set_header X-Forwarded-Proto $scheme;
            proxy_pass http://172.17.0.2:3000;
            proxy_http_version 1.1;
            proxy_set_header Upgrade $http_upgrade;
            proxy_set_header Connection 'upgrade';
            proxy_set_header Host $host;
            proxy_cache_bypass $http_upgrade;
    }

I have a Node.js container running at 172.17.0.2 with port 3000 exposed, but the address "host_ip/nodejs" doesn't seem to forward correctly. If I do host port mapping, then "host_ip:3000" redirects to a login page "./login", which is correct, but "host_ip/nodejs" redirects only to "/".

This is the test using curl.

$ curl host_ip:3000
Found. Redirecting to ./login/

$ curl host_ip/nodejs
See Other. Redirecting to /

If I want "host_ip/nodejs" to have the same effect as "host_ip:3000", what do I have to do?
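One hedged guess: with location /nodejs and a proxy_pass that has no URI part, nginx forwards the path /nodejs/... unchanged, which the Node app doesn't know about. Adding trailing slashes makes nginx strip the prefix before proxying (the app's own redirects such as ./login may still need separate attention):

    location /nodejs/ {
            proxy_set_header X-Real-IP $remote_addr;
            proxy_set_header X-Forwarded-For $proxy_add_x_forwarded_for;
            proxy_set_header X-Forwarded-Proto $scheme;
            proxy_pass http://172.17.0.2:3000/;   # trailing slash strips the /nodejs/ prefix
            proxy_http_version 1.1;
            proxy_set_header Upgrade $http_upgrade;
            proxy_set_header Connection 'upgrade';
            proxy_set_header Host $host;
            proxy_cache_bypass $http_upgrade;
    }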

Thanks.



via user180574

Reload PM2 configuration file

I have problems reloading a PM2 configuration file after editing it:

{
    "apps": [
        ...
        {
            "name": "foo",
            "script": "foo/index.js",
            "cwd": "foo",
            "watch": false
        }
    ]
}

I previously did

pm2 restart config.json

and

pm2 reload config.json

and

pm2 gracefulReload config.json

but they didn't reload the configuration for existing apps (the changes in app config did not apply). The only way that worked for me was:

pm2 delete foo
pm2 restart config.json

How is this supposed to be done?



via estus

How could I retrieve data from JSON objects to use it in .hbs templates

First, I've converted a CSV file to JSON using npm-csvtojson using the following code:

csv()
.fromFile(csvFilePath)
.on('json', (jsonObj) => {
    // combine csv header row and csv line to a json object
    // jsonObj.a ==> 1 or 4
    console.log(jsonObj);
})
.on('done', (error) => {
    console.log('end');
});

The console.log(jsonObj) result is:

{ FECHA: '2017-01-01 14:00:00', IDVARIABLE: '3', DATO: '123' }
{ FECHA: '2017-01-01 14:00:00', IDVARIABLE: '4', DATO: '52154' }
{ FECHA: '2017-01-01 14:00:00', IDVARIABLE: '7', DATO: '7.4' }
{ FECHA: '2017-01-01 14:00:00', IDVARIABLE: '11', DATO: '7.70' }

And then, I need to put this data in a .hbs template like this

<td style="border: 1px solid #000;">**</td>
<td style="border: 1px solid #000;">**</td>
<td style="border: 1px solid #000;">**</td>

How could I do this?
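A minimal sketch of one way to wire this up (the route handler shape and view name are illustrative): collect the rows in the done handler, pass them to res.render, and loop over them in the template with #each:

const rows = [];
csv()
  .fromFile(csvFilePath)
  .on('json', (jsonObj) => rows.push(jsonObj))
  .on('done', (error) => {
    if (error) return res.status(500).send(error.message);
    res.render('mytemplate', { rows: rows });   // 'mytemplate' is a placeholder .hbs view
  });

and in the .hbs template:

{{#each rows}}
<tr>
  <td style="border: 1px solid #000;">{{FECHA}}</td>
  <td style="border: 1px solid #000;">{{IDVARIABLE}}</td>
  <td style="border: 1px solid #000;">{{DATO}}</td>
</tr>
{{/each}}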



via Juan Pablo TM

Webpack / Node API Callback Throws Intermittent Client-Side Errors

I recently played with the callback function from the Webpack 2 Node API, in an attempt to log a custom message when compiling completed. https://webpack.js.org/api/node/

Doing so introduced strange, intermittent client-side errors. Webpack would appear to compile just fine; including firing off my callback without reporting any errors.

Then, loading the bundled files began throwing the following errors in the browser console (regardless of what code I put in the callback)

MOST COMMON:

Uncaught TypeError: Cannot read property 'NODE_ENV' of undefined

ALSO COMMON:

Uncaught TypeError: _invariant is not a function

It's important to note that this did not happen every time; only 2 or 3 times out of 5 when we restart the server. It does, however, only happen when a callback of any kind is passed to the webpack() function.

OUR CODE:

if (process.env.NODE_ENV === 'production') {
console.log('************** 📁  RUNNING IN PRODUCTION MODE 📁  **************');

  // SERVE THE STATIC FOLDER WHERE WEBPACK HAS BUILT OUR STUFF
  app.use('/static', express.static(path.join(__dirname, './CLIENTSIDE/static')));

} else {

  // ENABLE HOT RELOADING IN DEV MODE
  console.log('  🔧🔧🔧🔧🔧🔧🔧🔧🔧🔧🔧🔧🔧🔧🔧🔧🔧🔧🔧🔧🔧🔧🔧🔧🔧🔧🔧🔧🔧🔧');
  console.log('  🔧🔧🔧🔧  RUNNING IN DEV MODE 🔧🔧🔧🔧');
  console.log('  🔧🔧🔧🔧🔧🔧🔧🔧🔧🔧🔧🔧🔧🔧🔧🔧🔧🔧🔧🔧🔧🔧🔧🔧🔧🔧🔧🔧🔧🔧 \n');
  const webpack = require('webpack');
  const webpackConfig = require('./webpack.config.dev');
  const compiler = webpack(webpackConfig, (err, stats) => {

    console.log('THIS CALLBACK BREAKS THINGS!');

  })

  app.use(require('webpack-dev-middleware')(compiler, {
    noInfo: true,
    publicPath: webpackConfig.output.publicPath
  }));

  app.use(require('webpack-hot-middleware')(compiler));

}

WEBPACK.CONFIG.DEV.JS:

const path = require('path');
const webpack = require('webpack');

module.exports = {
  devtool: 'cheap-eval-source-map',
  entry: {
    background: ['webpack-hot-middleware/client', path.join(__dirname, '/CLIENTSIDE/components/background')],
    uniqueShare: ['webpack-hot-middleware/client',  path.join(__dirname, '/CLIENTSIDE/components/uniqueShare')],
    starRating: ['webpack-hot-middleware/client', path.join(__dirname, '/CLIENTSIDE/components/starRating')],
    testingPage: ['webpack-hot-middleware/client', path.join(__dirname, '/CLIENTSIDE/components/testingPage')],
    style: ['webpack-hot-middleware/client', path.join(__dirname, '/CLIENTSIDE/components/style')],
    v2_style: ['webpack-hot-middleware/client', path.join(__dirname, '/CLIENTSIDE/components/v2-styles')]
  },
  output: {
    path: path.join(__dirname, '/CLIENTSIDE/static'),
    filename: '[name].js',
    publicPath: '/static/'
  },
  plugins: [
    new webpack.optimize.OccurrenceOrderPlugin(),
    new webpack.HotModuleReplacementPlugin(),
    new webpack.NoEmitOnErrorsPlugin()
  ],
  module: {
    loaders: [
        {
            test: /\.(js|jsx)$/,
            exclude: /(node_modules|bower_components)/,
            include: path.join(__dirname, './CLIENTSIDE/components'),
            loaders: ['imports-loader?define=>false', 'react-hot-loader', { loader: 'babel-loader', options: {
              cacheDirectory: true,
              presets: ['react', 'es2015', 'stage-0'],
              plugins: ['transform-decorators-legacy', 'transform-object-assign', 'array-includes']
            }}
          ]
        },
        {
            test: /\.scss$/,
            loaders: ['style-loader', 'css-loader', 'sass-loader']
        }
      ]
    }
  };

RELEVANT PACKAGE.JSON:

"babel-cli": "^6.24.1",
"babel-core": "^6.24.1",
"babel-loader": "^7.0.0",
"babel-plugin-array-includes": "^2.0.3",
"babel-plugin-transform-decorators-legacy": "^1.3.4",
"babel-plugin-transform-object-assign": "^6.22.0",
"babel-preset-es2015": "^6.24.1",
"babel-preset-react": "^6.24.1",
"babel-preset-react-hmre": "^1.1.1",
"babel-preset-stage-0": "^6.24.1",
"css-loader": "^0.28.4",
"node-sass": "^4.5.3",
"react-hot-loader": "^1.3.1",
"react-transform-catch-errors": "^1.0.2",
"react-transform-hmr": "^1.0.4",
"style-loader": "^0.18.1",    
"webpack": "^2.6.1",
"webpack-dev-middleware": "^1.10.2",
"webpack-hot-middleware": "^2.18.0"



via Zfalen

sequelize error model is not associated to another model

I want to include a BelongsTo association

I have 2 models

user and team. A team has many users, but a user belongs to only one team, and a team has a team leader.

here is user model definition

module.exports = (sequelize, DataTypes) => {
   const user = sequelize.define('user', {
    username: {
    type: DataTypes.STRING,
    allowNull: false
  },
    password: {
    type: DataTypes.STRING,
    allowNull: false
  },
    team_id: {
    type: DataTypes.INTEGER,
    allowNull: false
  }
},
{
  classMethods: {
   associate: (models) => {
     user.belongsTo(models.team,{
        targetkey: 'id',
        foreignKey: 'team_id'
     });
   },
 },
});
return user;
};

here is team model definition

module.exports = (sequelize, DataTypes) => {
  const team = sequelize.define('team', {
   name: {
   type: DataTypes.STRING,
   allowNull: false
  },
   leader: {
   type: DataTypes.INTEGER,
   allowNull: false
  }
},
{
  classMethods: {
    associate: (models) => {
     team.hasMany(models.user, {
       foreignKey: 'id'
     });
     team.belongsTo(models.user,{
       targetkey: 'id',
       foreignKey: 'leader'
     });
   },
  },
 });
  return team;
 };

I want to get team information for each user when listing users, something like this:

{
  "id": 3,
  "username": "user123",
  "password": "password",
  "team_id": 1,
  "team": {
       "name": "name",
       "leader": "leader_name"
   }
}

I tried to include team but the response was empty.

here is the code:

list(req, res){
        return models.user.findAll({
            include: [{
              model: models.team
            }],
            offset: req.query.offset,
            limit: req.query.limit

        })
        .then(function(users){

        res.status(200).send(users)
        })
        .catch(function(error){
        res.status(400).send(error)
        })
    }

The response was empty.

The output was {}, and the error was "team is not associated to user".

Dialect: postgres

Database version: 9.6.3

Sequelize version: 3.30.4

I don't use aliases, so I wonder what might be wrong in my definitions.
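
For reference, Sequelize only knows about the belongsTo/hasMany definitions once each model's associate class method has actually been called while the models are loaded; if it never runs, findAll with include fails with exactly this "not associated" error. A minimal sketch of that wiring (the file name, connection config, and variable names are illustrative only):

// models/index.js (illustrative) -- import every model, then invoke associate()
const Sequelize = require('sequelize');
const sequelize = new Sequelize(/* connection config */);

const models = {
  user: sequelize.import('./user'),
  team: sequelize.import('./team')
};

// Without this loop the associations are never registered and
// include: [{ model: models.team }] throws "team is not associated to user".
Object.keys(models).forEach((name) => {
  if (models[name].associate) {
    models[name].associate(models);
  }
});

module.exports = models;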

Thanks in advance; very grateful for the help.



via luna

Variable doesn't start from zero

I'm trying to paint on a canvas using setInterval, but looking at the canvas it starts from 1.06 and runs unbelievably fast. The issue I would love to get solved: X shouldn't start from 1.06; it should start from 1.00 and go from there.

Here is the Fiddle: https://jsfiddle.net/r49naghf/

$(document).ready(function() {
    var canvas = $('#crashGraphic')[0];
    var graphStep = 0.1;
    var startX = 0;
    var context = canvas.getContext('2d');
    var canvasWidth;
    var canvasHeight;
    var scaleX = 30;
    var scaleY = 200;
    var border = 5;
    var drawX, drawY;

    function paintCrashGraphic(curentX, randomNumber, timeLeft) {
        canvasWidth = canvas.width;
        canvasHeight = canvas.height;
        context.lineWidth = 2;
        if ((border * 2 + curentX * scaleX) > canvasWidth) {
            scaleX = (canvasWidth - border * 2) / curentX;
        }
        if ((border * 2 + getCrashGraphicY(curentX) * scaleY) > canvasHeight) {
            scaleY = (canvasHeight - border * 2) / getCrashGraphicY(curentX);
        }
        context.strokeStyle = '#a5a5a5';
        context.clearRect(0, 0, canvasWidth, canvasHeight);
        context.beginPath();
        context.moveTo(border, canvasHeight - border);
        context.lineTo(border, border);
        context.moveTo(border, canvasHeight - border);
        context.lineTo(canvasWidth - border, canvasHeight - border);

        drawX = startX;
        var isFirst = true;
        context.stroke();
        context.beginPath();
        context.lineWidth = 3;

        context.strokeStyle = '#0d9b50';
        while (drawX <= curentX) {
            drawY = getCrashGraphicY(drawX);
            drawX += graphStep;
            if (isFirst) {
                isFirst = false;
                context.moveTo(border + drawX * scaleX, canvasHeight - border - (drawY - 1) * scaleY);
            } else {
                context.lineTo(border + drawX * scaleX, canvasHeight - border - (drawY - 1) * scaleY);
            }
        }
        if (timeLeft) {
            context.font = "100px Ubuntu-Regular,Helvetica,Arial,sans-serif";
            context.textAlign = "center";
            context.fillStyle = "#929292";
            drawString(context, 'Next round in \n ' + timeLeft, canvas.width / 2, canvas.height / 2, '#bf1c2d', '0', 'Ubuntu-Regular,Helvetica,Arial,sans-serif', '50');
        } else if (!randomNumber) {
            context.stroke();
            context.font = "100px Ubuntu-Regular,Helvetica,Arial,sans-serif";
            context.textAlign = "center";
            context.fillStyle = "#929292";
            context.fillText(drawY.toFixed(2) + 'x', canvas.width / 2, canvas.height / 2);
        } else if (randomNumber) {
            context.stroke();
            context.font = "50px Ubuntu-Regular,Helvetica,Arial,sans-serif";
            context.textAlign = "center";
            context.fillStyle = "red";
            drawString(context, 'Busted \n@ ' + parseFloat(randomNumber).toFixed(2) + 'x', canvas.width / 2, canvas.height / 2, '#bf1c2d', '0', 'Ubuntu-Regular,Helvetica,Arial,sans-serif', '50');
        }
    }

    function drawString(ctx, text, posX, posY, textColor, rotation, font, fontSize) {
        var lines = text.split("\n");
        if (!rotation) rotation = 0;
        if (!font) font = "'serif'";
        if (!fontSize) fontSize = 16;
        if (!textColor) textColor = '#000000';
        ctx.save();
        ctx.font = fontSize + "px " + font;
        ctx.fillStyle = textColor;
        ctx.translate(posX, posY);
        ctx.rotate(rotation * Math.PI / 180);
        for (var i = 0; i < lines.length; i++) {
            ctx.fillText(lines[i], 0, i * fontSize);
        }
        ctx.restore();
    }

    function getCrashGraphicY(x) {
        return Math.pow(1.06, x);
    }

    function getCrashGraphicX(y) {
        return Math.log(y) / Math.log(1.06);
    }
    paintCrashGraphic(startX);
    var timer;
    var x = 1;
    var speed = 1000/200;
    timer = setInterval(function(){
        x += 0.01;
      paintCrashGraphic(x);
    }, speed);


})
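
For reference, the value painted in the middle of the canvas is getCrashGraphicY(x) = 1.06^x, so with x starting at 1 the very first label is 1.06. A minimal sketch of the timer (same paint function assumed) that starts the counter at 0 so the display begins at 1.00:

// Start x at 0 so 1.06^0 === 1.00 is the first value drawn,
// and slow the interval down if the growth feels too fast.
var timer;
var x = 0;                 // was 1, which made 1.06^1 = 1.06 the first label
var speed = 1000 / 30;     // ~30 updates per second instead of 200
timer = setInterval(function () {
    x += 0.01;
    paintCrashGraphic(x);
}, speed);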



via Jordn

Having trouble providing an image to my report generation route

I am trying to add an image which is located at the path shown in the code. I already have the app serving static files from there, so I am not sure what my problem could be. I am using fluentreports.

Code:

var hh = function(Report) {
        //JSON.parse(req.body.facility).facilityID;
       
        Report.newLine(2);
        Report.image ( "../static/images/Logo.png" );
        Report.print('Device Scans', {fontBold: true, fontSize: 16, align: 'right'});
        Report.print('Client Name: ' + JSON.parse(req.body.client).clientName, {fontSize: 12, align: 'left'});
        Report.print('Facility Name: ' + JSON.parse(req.body.facility).name, {fontSize: 12, align: 'left'});
        Report.print('Facility Address: ' + JSON.parse(req.body.facility).address, {fontSize: 12, align: 'left'});
        Report.bandLine();
        Report.newLine();


};
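
One thing worth ruling out (a sketch, not a confirmed fix): express.static only affects HTTP requests, not file reads made inside the report, so the relative "../static/images/Logo.png" is resolved against the process working directory. Building an absolute path avoids that ambiguity (the directory layout is assumed here):

const path = require('path');

// Resolve the logo relative to this file instead of the process cwd
// (assumes the static folder sits one level above this module).
Report.image(path.join(__dirname, '../static/images/Logo.png'));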


via Tzvetlin Velev

Issue finding mongoose ObjectID in array of strings representing ObjectIDs

I need to find the index of the mongoose objectID in an array like this:

[ { _id: 58676b0a27b3782b92066ab6, score: 0 },
  { _id: 58676aca27b3782b92066ab4, score: 3 },
  { _id: 58676aef27b3782b92066ab5, score: 0 }]

The model I am using to compare is a mongoose schema with the following data:

{_id: 5868d41d27b3782b92066ac5,
 updatedAt: 2017-01-01T21:38:30.070Z,
 createdAt: 2017-01-01T10:04:13.413Z,
 recurrence: 'once only',
 end: 2017-01-02T00:00:00.000Z,
 title: 'Go to bed without fuss / coming down',
 _user: 58676aca27b3782b92066ab4,
 __v: 0,
 includeInCalc: true,
 result: { money: 0, points: 4 },
 active: false,
 pocketmoney: 0,
 goals: [],
 pointsawarded: { poorly: 2, ok: 3, well: 4 },
 blankUser: false }

I am trying to find the index of the model._user in the array above using the following:

var isIndex = individualScores.map(function(is) {return is._id; }).indexOf(taskList[i]._user);

Where individualScores is the original array and taskList[i] is the task model. However, this always returns -1. It never finds the correct _id in the array.
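
For reference, Mongoose ObjectIDs are objects, so Array.prototype.indexOf compares them by reference and never matches even when the hex values are identical. A minimal sketch that compares them by value instead:

// Compare as strings so indexOf uses value equality
var isIndex = individualScores
  .map(function (is) { return String(is._id); })
  .indexOf(String(taskList[i]._user));

// Alternative: use ObjectID#equals
var isIndex2 = individualScores.findIndex(function (is) {
  return is._id.equals(taskList[i]._user);
});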



via Ben Drury

configuration.entry error in grunt-webpack for webpack 2 with node 8

I'm trying to configure webpack 2 with Grunt, running on Node 8. I'm hooking webpack up through Grunt, but I keep getting the configuration.entry error in spite of providing all of the config.

Warning:

 Warning: Invalid configuration object. Webpack has been initialised using a configuration object that does not match the API schema.
 - configuration.entry should be one of these:
   object { <key>: non-empty string | [non-empty string] } | non-empty string | [non-empty string] | function
   The entry point(s) of the compilation.
   Details:
    * configuration.entry should be an object.
    * configuration.entry should be a string.
    * configuration.entry[1] should be a string.
    * configuration.entry should be an instance of function
      function returning an entry object or a promise.. Used --force, continuing.

Please can someone check what is wrong with this configuration:

Grunt Snippet:

    webpack: {
        options: webpackConfig,

        prod: {
            devtool: '',    // Disable source maps
            plugins: webpackConfig.plugins.concat(
                new webpack.optimize.DedupePlugin(),
                new webpack.optimize.UglifyJsPlugin(),
                new webpack.DefinePlugin({
                    'process.env': {
                        'NODE_ENV': JSON.stringify('production')
                    }
                })
            )
        }
    },

    'webpack-dev-server': {

        options: webpackConfig,

        start: {
            port: 9002,
            keepAlive: true
        }

    },

Webpack config:

const ExtractTextPlugin = require('extract-text-webpack-plugin');
const path = require('path');
const webpack = require('webpack');

const SrcDir = path.resolve(__dirname, 'src');
const AppDir = path.resolve(SrcDir, 'application');

const BuildDir = path.resolve(__dirname, 'dist');
const AppBuildDir = path.resolve(BuildDir, 'application');

const ExtractSass = new ExtractTextPlugin({
filename: '[name].[contenthash].css',
disable: process.env.NODE_ENV === 'development' || 
  process.env.NODE_ENV === 'dev'
});

module.exports = {
cache: true,
context: __dirname,
entry: {
    app: path.join(AppDir, 'app.jsx'),
    vendor: [
        'react',
        'react-dom',
        'react-redux',
        'redux-dialog',
        'redux-thunk',
        'reselect'
    ]
},
output: {
    filename: '[name].js',
    path: AppBuildDir,
    libraryTarget: 'commonjs2',
    publicPath: './'
},
module: {
    rules: [{
        test: /\.scss$/,
        use: ExtractSass.extract({
            use: [{
                loader: 'css-loader'
            }, {
                loader: 'sass-loader'
            }],
            fallback: 'style-loader' // for dev env
        })
    }, {
        test: /\.jsx?$/,
        exclude: /(node_modules)/,
        use: {
            loader: 'babel-loader',
            options: {
                cacheDirectory: true,
                presets: [
                    ['env', {
                        targets: {
                            browsers: ['last 3 versions'],
                            node: '8.1.0'
                        }
                    }],
                    'react'
                ]
            }
        }
    }, {
        test: /\.spec\.js$/,
        exclude: /(node_modules)/,
        use: 'mocha-loader'
    }]
},
plugins: [
    ExtractSass,
    new webpack.optimize.CommonsChunkPlugin({
        name: 'vendor',
        filename: 'vendor.bundle.js'
    })
],
resolve: {
    modules: [
        AppDir,
        path.resolve(SrcDir, "server"),
        "node_modules"
    ],
    extensions: ['.js', '.jsx']
},
devtool: 'inline-source-map',
devServer: {
    hot: true,
    contentBase: BuildDir
}
};

These are the packages I am using:

"webpack": "^2.6.1",
"webpack-dev-server": "^2.4.5"
"grunt-webpack": "^3.0.0"



via poushy

Multiple API calls with Async & Request Package (NodeJS / Express)

I am trying to implement async and request for asynchronous API calls within one of my routers. The endpoint is just a query string, and I'm trying to return an array or object with the results of all of the calls. So far there are just two:

router.get('/', function(req, res) {
    async.parallel([
        function(next) {
            request(queryString + 'end point string', function(error, response, body) {
                if (!error && response.statusCode == 200) {
                    var unsPAM = JSON.parse(body);
                };
                console.log(error);
            });
        },
        function(next) {
            request(queryString + 'end point string', function(error, response, body) {
                if (!error && response.statusCode == 200) {
                    var unsAll = JSON.parse(body);
                };
                console.log(error);
            });
        }], 
        function(err, results) {
            res.render("api-results", [results]);
        });
    });

This is the gist of what I'm trying to do. If I console.log each variable after the request it works properly, but nothing is being returned and my EJS template is not being served.

I have also tried using something like the below in various formats (array/object form) but I cannot seem to get it working:

res.render("api-results", { unsPAM: unsPAM, unsAll: unsAll });

I think it's because the results of the requests aren't making it into the async results array somehow. Any advice, best-practice solutions, or alternative ideas would be greatly appreciated.
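
For reference, async.parallel only fires its final callback after every task has invoked its next callback, and the results array is built from whatever each task passes to next. A minimal sketch of the two request tasks wired up that way (error handling kept deliberately small):

router.get('/', function (req, res) {
    async.parallel([
        function (next) {
            request(queryString + 'end point string', function (error, response, body) {
                if (error || response.statusCode !== 200) { return next(error || new Error('bad status')); }
                next(null, JSON.parse(body));       // becomes results[0]
            });
        },
        function (next) {
            request(queryString + 'end point string', function (error, response, body) {
                if (error || response.statusCode !== 200) { return next(error || new Error('bad status')); }
                next(null, JSON.parse(body));       // becomes results[1]
            });
        }
    ], function (err, results) {
        if (err) { return res.status(500).send(err.message); }
        res.render('api-results', { unsPAM: results[0], unsAll: results[1] });
    });
});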

Thank you.



via Lukon

GraphQL arguments on Object in query

I want to execute a query like this:

{
  houses(owner: "Thomas") {
    id
    color
    cars(type: "Sports Car") {
      name
      year
    }
  }
}

But this returns an error:

"message": "Unknown argument \"type\" on field \"cars\" of type \"House\".",

However, I'm able to execute this properly:

cars(type: "Sports Car") {
  name
  year
}

Is what I'm trying to do even possible?
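
For reference, an argument has to be declared on the field that receives it, so the House type needs a type argument (and a resolver that uses it) on its cars field. A minimal graphql-js sketch, with the type and field names assumed from the query above:

const {
  GraphQLObjectType, GraphQLList, GraphQLString, GraphQLInt, GraphQLID
} = require('graphql');

const CarType = new GraphQLObjectType({
  name: 'Car',
  fields: {
    name: { type: GraphQLString },
    year: { type: GraphQLInt },
    type: { type: GraphQLString }
  }
});

const HouseType = new GraphQLObjectType({
  name: 'House',
  fields: () => ({
    id: { type: GraphQLID },
    color: { type: GraphQLString },
    cars: {
      type: new GraphQLList(CarType),
      // Declaring the argument here is what removes the
      // "Unknown argument \"type\" on field \"cars\"" error.
      args: { type: { type: GraphQLString } },
      resolve: (house, args) =>
        house.cars.filter(car => !args.type || car.type === args.type)
    }
  })
});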

Thanks in advance!



via Thomas

AngularJS with NodeJS: problems with body-parser

I just want to connect my AngularJS app with a RESTful API in Node.js, but I have problems getting the POST content with body-parser.

This is my AngularJS request via $http. "user" is a JSON object like this:

{email: "juanfran@test.com", psw: "Juanfran"}

 $scope.log = function(user){
    console.log(user);
    $http.post(APIURL + "/usuario/auth", user, {headers: {"Content-type": "application/x-www-form-urlencoded"}})
        .then(function successCallback(response) { 

                if(response.data[0] != null){
                    $localStorage.token = response.data;
                }
                else{
                    swal({   title: "Error",   text: response.data.Error,   type: "error",  confirmButtonText: "Ok" });
                }

        }, function(error){  swal({   title: "Error",   text: error.data.Error,   type: "error",  confirmButtonText: "Ok" });  });  

    return false;
} 

And this is my node.js configuration and route.

var methodOverride = require('method-override');

var app = express();

app.use(logger('dev'));
// parse application/x-www-form-urlencoded
app.use(bodyParser.urlencoded({ extended: true }));
app.use(express.static(path.join(__dirname, 'public')));

app.use(function(req, res, next){
    res.header("Access-Control-Allow-Origin", "*");
    res.header("Access-Control-Allow-Headers", "Origin, X-Requested-With, Content-Type, Accept");
    res.setHeader('Access-Control-Allow-Methods', 'POST, GET, DELETE, PATCH, OPTIONS');
    next();
});

And how I get the form data:

router.post("/usuario/auth/", function(req, res){

    var userData = {
        email : req.body.email, 
        password : req.query.pwd,
    };
    console.log(userData);
    console.log(req.get("email"));
    UserModel.authUsuario(userData,function(error, data)
    {
        if(error == null && data && data.status === "OK")
        {
            res.status(200).json({"Success":"Correcto."});
        }
        else
        {
            res.status(200).json({"Error":error.Error});
        }
    });
});

And the output...

{ email: undefined, password: undefined }
undefined
POST /usuario/auth 200 4.668 ms - 32

And here are the POST headers:

Request URL:http://localhost:3000/usuario/auth
Request Method:POST
Status Code:200 OK
Remote Address:[::1]:3000
Referrer Policy:no-referrer-when-downgrade
Response Headers
Access-Control-Allow-Headers:Origin, X-Requested-With, Content-Type, Accept
Access-Control-Allow-Methods:POST, GET, DELETE, PATCH, OPTIONS
Access-Control-Allow-Origin:*
Connection:keep-alive
Content-Length:32
Content-Type:application/json; charset=utf-8
Date:Mon, 12 Jun 2017 19:16:29 GMT
ETag:W/"20-VvY6+aRGMnbvwJDOEFg7V2+2T9E"
X-Powered-By:Express

Request Headers

Accept:application/json, text/plain, */*
Accept-Encoding:gzip, deflate, br
Accept-Language:es-ES,es;q=0.8
Cache-Control:no-cache
Connection:keep-alive
Content-Length:58
Content-type:application/x-www-form-urlencoded, application/x-www-form-urlencoded;charset=UTF-8;
Host:localhost:3000
Origin:http://localhost
Pragma:no-cache
Referer:http://localhost/juanfrantraining/
User-Agent:Mozilla/5.0 (Windows NT 10.0; Win64; x64) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/58.0.3029.110 Safari/537.36

I cannot see what's happening. Thanks!
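
For reference, $http.post serializes the user object as JSON even when the Content-Type header is overridden, so bodyParser.urlencoded finds nothing it can decode. A minimal sketch that keeps Angular's default JSON payload and parses it on the Express side instead (route internals trimmed):

// Server: accept JSON bodies in addition to urlencoded ones
app.use(bodyParser.json());
app.use(bodyParser.urlencoded({ extended: true }));

// Client: drop the Content-type override and send plain JSON
$http.post(APIURL + '/usuario/auth', user).then(/* ... */);

// Route: both fields now arrive in req.body
router.post('/usuario/auth/', function (req, res) {
    var userData = {
        email: req.body.email,
        password: req.body.psw   // the Angular object uses "psw"
    };
    // ...
});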



via Hipólito Pérez

NodeJS express proxy - Getting a port number from params and supplying it in the proxy target?

Pretty strange one here.

I have url's with changing random endpoint ports e.g:

  • 0.0.0.0:1111
  • 0.0.0.0:2222
  • 0.0.0.0:3333

Now I'm contacting an API on these endpoints from a different endpoint (say 0.0.0.0:4444). Therefore I inevitably hit CORS issues from my JS client-side code.

Now, my client-side code actually knows the port. So I've tried to start implementing a NodeJS proxy which lets me get around this. Therefore I've decided to try and pass the port in as a URL parameter. So I can call 0.0.0.0:4444/api/[port].

My current attempt to do this is by using the npm package 'express-request-proxy'

const requestProxy = require('express-request-proxy');

app.all('/api/:port/*', requestProxy({
  url: 'http://0.0.0.0:<<PORT NEEDED HERE>>/*',
}));

I have tried the following:

url: 'http://0.0.0.0::port/*'

But this just returns the error:

Error: connect ECONNREFUSED 0.0.0.0:80
    at Object.exports._errnoException (util.js:1014:11)
    at exports._exceptionWithHostPort (util.js:1037:20)
    at TCPConnectWrap.afterConnect [as oncomplete] (net.js:1138:14)

I've successfully done this using NGINX with some config tweaks but I was hoping to move over to Node/Express.

Hoping someone can make sense of what I require and give me some help.
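
One fallback worth trying (a sketch, not a tested solution): build the proxy middleware inside the route handler, so the port from req.params is interpolated into the target URL before express-request-proxy ever sees it:

const requestProxy = require('express-request-proxy');

app.all('/api/:port/*', function (req, res, next) {
  // Create the proxy per request so the port segment is a concrete value,
  // not a pattern the proxy has to parse out of the URL string.
  const proxy = requestProxy({
    url: 'http://0.0.0.0:' + req.params.port + '/*'
  });
  return proxy(req, res, next);
});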

Thanks.



via Peza

Concurrent updating of a data value in redis + socket.io

I'm implementing an API using Node.js where multiple users can make simultaneous requests to update a value stored in a Redis DB. Since all the operations are atomic, implementing this isn't an issue. So far, so good.

But the scenario I am facing is this: I have a threshold value (say T1), and multiple clients (say 10) make requests to update the SAME value at the SAME time (say the stored value is T1+2 before the 10 concurrent requests arrive). So the update needs to succeed for the first 2 users, and the remaining 8 socket clients must be notified of the LOW balance (failed operation).

NOTE: the request made by every individual will decrement the value by 1.

I have searched extensively for a couple of hours.

But I could not find any complete solution. Is there any callback parameter where we can check for the MINIMUM THRESHOLD situation, or some kind of success/fail promise?
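
For reference, one way to make the check-and-decrement a single atomic step is a small Lua script run with EVAL, so Redis itself refuses requests once the threshold is reached; the 8 rejected clients can then be notified over their sockets. A minimal sketch using the node_redis client (key name and threshold handling are illustrative):

const redis = require('redis');
const client = redis.createClient();

// Decrement only while the new value stays at or above the threshold;
// return -1 when the request must be refused.
const script = `
  local v = tonumber(redis.call('GET', KEYS[1]) or '0')
  if v - 1 >= tonumber(ARGV[1]) then
    return redis.call('DECR', KEYS[1])
  end
  return -1
`;

function tryConsume(key, threshold, cb) {
  client.eval(script, 1, key, threshold, function (err, result) {
    if (err) { return cb(err); }
    cb(null, result >= 0);   // true: accepted, false: emit "low balance" to the socket
  });
}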

Any help would be highly appreciated.



via Kunal Sharma

Java frontend login to Nodejs backend

I have created a basic NodeJS login and registration system. I will be building a full web application with Node and Socket.IO since I need to send data back and forth.

As well as the web client, I would like to use a Java client. My thought was to use a URLConnection with a POST in the same way a browser would log in. I would then use the session to ensure the Java client is connected to the Node.js server.

Am I going about this the right way? Would I be better off using a Java server?



via user2205189

Render raw image bytes to response body

I'm creating an API that makes authorized calls to Google's APIs, specifically Drive for this question. My API is working fine and uses Google's Node API to make the requests. When I fire off a request to this resource, I get back the following response:

{
 "kind": "drive#file",
 "id": "...",
 "name": "bookmobile.jpg",
 "mimeType": "image/jpeg"
}

I use the above response to determine the MIME type of the file I'm to display later. I then make a subsequent call to the same endpoint, but specifying alt=media as an option to download the file as specified in Google's Guide. If I console.log or res.send() the response, I get the following output:

[raw image bytes]

Which we can see is the raw image bytes from the API call. How do I render these bytes to the response body properly? My code is as follows:

// DriveController.show
exports.show = async ({ query, params }, res) => {
  if (query.alt && query.alt.toLowerCase().trim() === 'media') {
    // Set to JSON as we need to get the content type of the resource
    query.alt = 'json'

    // Get the Files Resource object
    const options = createOptions(query, params.fileId)
    const filesResource = await Promise.fromCallback(cb => files.get(options, cb))

    // Grab the raw image bytes
    query.alt = 'media'
    await createAPIRequest(createOptions(query, params.fileId), 'get', res, filesResource)
  } else {
    await createAPIRequest(createOptions(query, params.fileId), 'get', res)
  }
}

async function createAPIRequest (options, method, res, filesResource = {}) {
  try {
    const response = await Promise.fromCallback(cb => files[method](options, cb))
    if (filesResource.hasOwnProperty('mimeType')) {
      // Render file resource to body here
    } else {
      res.json(response)
    }
  } catch (error) {
    res.json(error)
  }
}

The various answers I've found here all seem to point to the following:

res.type(filesResource.mimeType)
const image = Buffer.from(response, 'binary')
fs.createReadStream(image).pipe(res)

But this kills my Express app with the following error:

Error: Path must be a string without null bytes

How would I go about rendering those raw image bytes to the response body properly?
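
For reference, fs.createReadStream expects a file path, so handing it a Buffer is what raises the "Path must be a string without null bytes" error. A minimal sketch that writes the buffer straight to the response instead (assuming response holds the raw body returned by the Drive client; the underlying request may also need a binary/null encoding so the bytes are not mangled on the way in):

async function createAPIRequest (options, method, res, filesResource = {}) {
  try {
    const response = await Promise.fromCallback(cb => files[method](options, cb))
    if (filesResource.hasOwnProperty('mimeType')) {
      // Send the bytes directly; no temporary file or read stream is needed.
      res.type(filesResource.mimeType)
      res.send(Buffer.from(response, 'binary'))
    } else {
      res.json(response)
    }
  } catch (error) {
    res.json(error)
  }
}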



via Francisco Mateo

Can't reference passed value attribute in returned HTML code

I'm sure I'm doing something dumb, but I don't see why this code won't work. It's React code that I'm compiling with webpack:

  var markers = this.state.assets;
      assets = assets.map(function(asseti,index){
        return(
          asseti.map(function(asset, index){
            return(
              <Marker position=[{asset.location.coordinates[0]},{asset.location.coordinates[1]}]>
                <Popup>
                  <span>A pretty CSS3 popup.<br/>Easily customizable.</span>
                </Popup>
              </Marker>

            )
          })
        )
      });

I get the error

JSX value should be either an expression or a quoted JSX text

  90 |           asseti.map(function(asset, index){
  91 |             return(
> 92 |               <Marker position=[{asset.location.coordinates[0]},{asset.location.coordinates[1]}]>
     |                                ^
  93 |                 <Popup>
  94 |                   <span>A pretty CSS3 popup.<br/>Easily customizable.</span>
  95 |                 </Popup>
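
For reference, a JSX attribute value has to be either a quoted string or a single {...} expression, so the array needs to be wrapped as position={[...]}. A minimal sketch of the same map with that change (a key is added to quiet React's list warning):

assets = assets.map(function (asseti) {
  return asseti.map(function (asset, index) {
    return (
      <Marker key={index}
              position={[asset.location.coordinates[0], asset.location.coordinates[1]]}>
        <Popup>
          <span>A pretty CSS3 popup.<br/>Easily customizable.</span>
        </Popup>
      </Marker>
    );
  });
});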

Thanks, Ed.



via Ed Lynch

Mongoose Virtual field with async getter

I have an item model with a virtual field that refers to stock badges.

'use strict';

const mongoose = require('mongoose');
const mongooseHidden = require('mongoose-hidden')();
const Badge = mongoose.model('Badge');

const validateProperty = function(property) {
  return (property.length);
};

const Schema = mongoose.Schema;

const ItemSchema = new Schema({
  itemCode: {
    type: Number,
    index: {
      unique: true,
      sparse: true // For this to work on a previously indexed field, the index must be dropped & the application restarted.
    },
    required: true
  },
  itemName: {
    type: String,
    uppercase: true,
    trim: true
  },
  barcode: {
    type: String,
    trim: true
  },
  category: {
    type: Schema.Types.ObjectId,
    ref: 'Category'
  },
  subCategory: {
    type: Schema.Types.ObjectId,
    ref: 'SubCategory'
  },
  updated: {
    type: Date
  },
  created: {
    type: Date,
    default: Date.now
  },
  status: {
    type: String,
    enum: [
      'active', 'inactive', 'removed'
    ],
    default: 'active'
  }
}, {id: false});

ItemSchema.virtual('badges').get(function() {
  return this.getAvailableBadges();
});

ItemSchema.methods.getAvailableBadges = function() {
  Badge.find({
    item: this._id
  }, (err, badges) => {
    if (badges) {
      return badges;
    } else {
      return [];
    }
  });
};

ItemSchema.set('toJSON', {virtuals: true});
ItemSchema.set('toObject', {virtuals: true});

ItemSchema.plugin(mongooseHidden, {
  hidden: {
    _id: false,
    __v: true
  }
});

mongoose.model('Item', ItemSchema);

And the badge model is as below:

'use strict';

const mongoose = require('mongoose');
const mongooseHidden = require('mongoose-hidden')();

const validateProperty = function(property) {
  return (property.length);
};

const Schema = mongoose.Schema;

const BadgeSchema = new Schema({
  item: {
    type: Schema.Types.ObjectId,
    ref: 'Item'
  },
  qty: {
    type: Number,
    validate: [validateProperty, 'Please enter Quantity !']
  },
  purchasingPrice: {
    type: Number,
    validate: [validateProperty, 'Please enter purchasingPrice !']
  },
  sellingPrice: {
    type: Number,
    validate: [validateProperty, 'Please enter sellingPrice !']
  },
  updated: {
    type: Date
  },
  created: {
    type: Date,
    default: Date.now
  },
  status: {
    type: String,
    enum: [
      'active', 'inactive', 'removed'
    ],
    default: 'active'
  }
});

BadgeSchema.plugin(mongooseHidden, {
  hidden: {
    _id: false,
    __v: true
  }
});

mongoose.model('Badge', BadgeSchema);

The item's badges virtual field doesn't get populated.

How can a virtual field work with an async getter method?

I have put some console log statements and found that getAvailableBadges is getting data.
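
For reference, a virtual getter is synchronous, so whatever Badge.find hands to its callback is lost before toJSON runs. A minimal sketch using a virtual populate instead (available since Mongoose 4.5), which keeps badges on the item but lets the query fill it in:

// Declare the relation instead of computing it in a getter
ItemSchema.virtual('badges', {
  ref: 'Badge',          // model to populate from
  localField: '_id',     // Item._id ...
  foreignField: 'item'   // ... matched against Badge.item
});

// Usage: request the badges explicitly when querying items
Item.find().populate('badges').exec(function (err, items) {
  // items[n].badges is now an array of Badge documents
  // (toJSON/toObject already have virtuals: true above, so they serialize)
});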



via user2473015

Forwarding keyUp and change events from TinyMCE editor to TextArea element using Angular 4 Reactive Forms

I have a reactive form in Angular 4 in which I want to use TinyMCE to allow users to format the text that goes inside a textarea.

I have created an component to host my TinyMCE editor, just like below (tinymce.component.ts):

import { Component, AfterViewInit, ViewChild, EventEmitter, forwardRef, ElementRef, OnDestroy, Input, Output } from '@angular/core';
import { 
ControlValueAccessor, 
NG_VALUE_ACCESSOR, 
NG_VALIDATORS, 
FormControl, 
Validator 
} from '@angular/forms';

@Component({
selector: 'tinymce',
templateUrl: './tinymce.component.html',
providers: [
{
  provide: NG_VALUE_ACCESSOR,
  useExisting: forwardRef(() => TinyMCEComponent),
  multi: true,
},
{
  provide: NG_VALIDATORS,
  useExisting: forwardRef(() => TinyMCEComponent),
  multi: true,
}]
})
export class TinyMCEComponent implements ControlValueAccessor, Validator, AfterViewInit, OnDestroy {
@Input() elementId: String;

constructor(private el: ElementRef) { }

editor;

content: string = 'weee';
private parseError: boolean;

// the method set in registerOnChange, it is just 
// a placeholder for a method that takes one parameter, 
// we use it to emit changes back to the form
private propagateChange = (_: any) => { };

// this is the initial value set to the component
public writeValue(obj: any) {
    if (obj) {
        this.content =  obj; 
    }
}

// registers 'fn' that will be fired when changes are made
// this is how we emit the changes back to the form
public registerOnChange(fn: any) {
    this.propagateChange = fn;
}

// not used, used for touch input
public registerOnTouched() { }

// change events from the textarea
private onChange(event) {
    $("#" + this.elementId).change();
    // update the form
    this.propagateChange(this.content);
}

// returns null when valid else the validation object 
// in this case we're checking if the json parsing has 
// passed or failed from the onChange method
public validate(c: FormControl) {
    return (!this.parseError) ? null : {
        jsonParseError: {
            valid: false,
        },
    };
}

ngAfterViewInit() {
    tinymce.init({
        selector: '#' + this.elementId,
        plugins: ['link', 'paste', 'table'],
        skin_url: '../assets/skins/lightgray',
        setup: editor => {
            this.editor = editor;
            editor.on('keyup', () => {
                this.content = editor.getContent();
                this.onChange(null);
            });
        }
    });
}

ngOnDestroy() {
    tinymce.remove(this.editor);
}
}

Then I have the component html (tinymce.component.html):

<textarea
#conteudo
id=""
[value]="content"
(change)="onChange($event)" 
(keyup)="onChange($event)">
</textarea>

And my form:

    <form [formGroup]="procedimentoForm" class="ui form">
      <div class="field">
    <label>Conteúdo</label>
    <tinymce formControlName="conteudo" [elementId]="'conteudo'"></tinymce>
  </div>

</form>

I would like the form control to get its value updated on any change in the TinyMCE editor (on keyup, for example). The way it's working at the moment, "parametroForm.value.conteudo" only gets updated in the form after I move focus to another form control.
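
One thing that may be worth checking (a sketch under assumptions, not a confirmed fix): the textarea's id is empty in the template shown, so the '#' + this.elementId selector may never match, and TinyMCE's own listeners run outside Angular's change detection. The sketch below binds the id, propagates on change as well as keyup, and re-enters Angular's zone so the reactive form updates immediately:

// tinymce.component.html -- bind the id so the selector below can find the element
// <textarea [id]="elementId" [value]="content"></textarea>

// tinymce.component.ts
import { NgZone } from '@angular/core';

// constructor(private el: ElementRef, private zone: NgZone) { }

ngAfterViewInit() {
    tinymce.init({
        selector: '#' + this.elementId,
        plugins: ['link', 'paste', 'table'],
        skin_url: '../assets/skins/lightgray',
        setup: editor => {
            this.editor = editor;
            // Propagate on change as well as keyup, inside Angular's zone,
            // so the form control sees the new value without waiting for blur.
            editor.on('keyup change', () => {
                const content = editor.getContent();
                this.zone.run(() => {
                    this.content = content;
                    this.propagateChange(content);
                });
            });
        }
    });
}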



via Alaor