Kafka Example in Node.js

Apache Kafka is a distributed streaming platform. Originally developed at LinkedIn and now maintained by the Apache Software Foundation, it lets us do three important things: publish and subscribe to streams of data, store those streams durably, and process them as they arrive. With these capabilities we can use Kafka in various situations such as real-time streaming, stream storage, and message queues, and it runs in production alongside services written in Node.js, Scala, and Go, transporting tens of thousands of messages every second; measured by GitHub stars and forks it has seen wider adoption than RabbitMQ. Compared with Redis for log aggregation, Kafka is built for today's volumes: there is no question that we generate more logs than we ever have before.

Node.js is the rare example of a Linux-centric tool with great cross-platform support; its client-library calls are asynchronous (non-blocking), and its ecosystem includes microservices toolkits such as Seneca and realtime libraries such as Socket.IO. Integrating external services into an application is often challenging, which is exactly the problem a message broker solves. Messages on a Kafka topic are commonly plain-text and schemaless (for example JSON) or binary with an enforced schema such as Avro. On the producing side, the DefaultPartitioner is good enough for most cases, sending messages to partitions on a round-robin basis to balance out the load. Kafka also integrates broadly with the wider ecosystem: Kubeless functions can be registered to listen to PubSub events on a Kafka topic, the OpenCensus Kafka exporter lets you export traces to Kafka, and AWS Lambda supports standard data-stream iterators and HTTP/2 stream consumers. The producer and consumer examples below are known to work, but note that parts of this article were written for an older version of Node.js. A minimal producer sketch follows.
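To make the producing side concrete, here is a minimal sketch using the kafka-node package. The broker address (localhost:9092) and the topic name (test) are placeholders, and the cyclic partitioner is chosen explicitly to get the round-robin balancing described above.

```js
// producer.js — a minimal kafka-node producer sketch; broker address and topic name are assumptions
const kafka = require('kafka-node');

const client = new kafka.KafkaClient({ kafkaHost: 'localhost:9092' });
// partitionerType 2 = cyclic, i.e. messages are spread round-robin across partitions
const producer = new kafka.Producer(client, { partitionerType: 2 });

producer.on('ready', () => {
  producer.send([{ topic: 'test', messages: ['hello kafka'] }], (err, result) => {
    if (err) console.error('send failed:', err);
    else console.log('sent:', result);
  });
});

producer.on('error', (err) => console.error('producer error:', err));
```

kafka-node also provides a KeyedMessage type if you want related messages pinned to the same partition instead of balanced round-robin.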
This two-part tutorial introduces Kafka, starting with how to install and run it in your development environment; the setup also works on Windows. Follow the instructions on the Kafka wiki to build Kafka and get a test broker up and running, and make sure Kafka is set up and running on your machine before continuing. In the last two tutorials we created simple Java examples of a Kafka producer and a consumer; this time we will do the same from Node.js: a Node.js application that publishes messages to a Kafka topic (based on entries in a CSV file), plus a simple Kafka Streams Java application that processes those messages from that topic. Although there are plenty of examples on the web showing how to implement APIs with Node.js, when combining Node.js with Kafka I was unable to find many examples of their use together in tutorials, open-source code on GitHub, or blog posts. In this post we'll look at how to set up an Apache Kafka instance, create a user service to publish data to topics, and build a notification service to consume data from those topics.

The demo application has two halves. A worker process sends random numbers to Kafka, so its source is a periodically generated random number and its sink is Kafka; in the web process the source is Kafka and the sink is a WebSocket that pushes the random numbers to the browser (a worker sketch follows below). You can check what has been published with the console consumer that ships with Kafka, for example kafka-console-consumer.sh --zookeeper localhost:2181 --topic test --from-beginning, which replays everything on the topic: "This is a message", "This is another message here", "This is the First Message I am sending". Have a look at the Kafka documentation for more information about consumer groups. There is also a Node-RED module for Apache Kafka publish/subscribe that uses the Confluent librdkafka C library.
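Here is a minimal sketch of that worker process, again assuming kafka-node; the broker address, the topic name random-numbers, and the one-second interval are all assumptions for illustration.

```js
// worker.js — sketch of the worker: source = a timer producing random numbers, sink = Kafka
const kafka = require('kafka-node');

const client = new kafka.KafkaClient({ kafkaHost: 'localhost:9092' }); // assumed broker address
const producer = new kafka.Producer(client);

producer.on('ready', () => {
  setInterval(() => {
    const value = Math.random();
    producer.send([{ topic: 'random-numbers', messages: [String(value)] }], (err) => {
      if (err) console.error('send failed:', err);
    });
  }, 1000); // one random number per second
});

producer.on('error', (err) => console.error('producer error:', err));
```

The web process would do the reverse: consume from the same topic and forward each message over a WebSocket (for example with Socket.IO) to the browser.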
Kafka clients other than the JVM client are maintained outside the main Kafka code base; the reason for this is that it allows a small group of implementers who know the language of that client to quickly iterate on their code base on their own release cycle. For Node.js, kafka-node is a Node.js client with ZooKeeper integration for Apache Kafka 0.8 and later, and it is the client used in this article. You will need a running Apache Kafka cluster. Node.js itself has an event-driven architecture capable of asynchronous I/O, which makes it lightweight and efficient, and the Kafka producer client consists of only a handful of APIs. On CentOS/RHEL 7/6 you can add a yum repository and install a recent Node.js with a few simple commands.

We typically set the name of a consumer to a string consisting of the hostname appended by a randomly generated UUID; a sketch of a consumer group that follows this convention is shown below. If you still use the old consumer implementation, replace --bootstrap-server with --zookeeper in the console commands. Now that we know the basics, we can build more complex systems using the above tools. For deployment, our first step is to specify build instructions in a Dockerfile, and the compose file can also define mongo-db as the sink database along with the web-based mongoclient, which helps us verify that the sent data arrived correctly. For an idea of scale, Wikimedia runs one Kafka process (called a broker) on each node, forming two separate clusters (one in eqiad and one in codfw) that are not aware of each other. The comments in the code explain how it works, and the surrounding text explains what it does and how to test it.
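A minimal sketch of a kafka-node ConsumerGroup that follows that naming convention might look like this; the broker address, the group id example-group, and the topic test are placeholders.

```js
// consumer.js — kafka-node ConsumerGroup sketch; broker, group id and topic are assumptions
const os = require('os');
const crypto = require('crypto');
const kafka = require('kafka-node');

// Consumer name: hostname plus a random suffix, e.g. "application01-3321631a…"
const consumerId = `${os.hostname()}-${crypto.randomBytes(8).toString('hex')}`;

const consumer = new kafka.ConsumerGroup(
  {
    kafkaHost: 'localhost:9092',
    groupId: 'example-group',
    id: consumerId,
    fromOffset: 'earliest',
  },
  ['test']
);

consumer.on('message', (message) => {
  console.log(`${message.topic}[${message.partition}] @${message.offset}: ${message.value}`);
});

consumer.on('error', (err) => console.error('consumer error:', err));
```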
Start by downloading the client library for Node.js and installing it with npm. We will be creating a Kafka producer and a consumer in Node.js, and the first thing you have to do is connect to the Kafka server. Keep the division of responsibilities in mind: Kafka is a distributed streaming platform, whereas ZooKeeper is a centralized service for maintaining configuration information, naming, providing distributed synchronization, and providing group services. If you want kafka-docker to create topics automatically when the containers come up, a KAFKA_CREATE_TOPICS environment variable can be added to docker-compose, and you can then pipe a test message in with the console producer. When our Kafka cluster got bigger, and with our growing number of producers, we wanted to ensure that our data pipeline was fault tolerant; in one stress test we ingested about 20,000 messages per second, with message sizes varying between 200 bytes and 1 KB, into Kafka, with Spark Streaming reading them back out. End-to-end integration with Kafka in Apache Spark's Structured Streaming (consuming messages, doing simple to complex windowed ETL, and pushing the output to memory, console, files, databases, or back to Kafka itself) is covered later; the second part of that series builds a real-time example for analysis and monitoring of Uber car GPS trip data.

A few asides that come up along the way: Node developers have a number of options for AMQP client libraries; Node SQLite3 is a Node.js binding for SQLite; nvm lets you upgrade quickly to the latest version of Node.js; on OpenWhisk, Node.js 6 is the default runtime picked by the wsk CLI and Whisk Deploy; and while macOS and Windows will treat 'myclass.js' and 'MyClass.js' as the same file, Linux will not. The DataStax drivers also support promises for metadata-fetching methods besides CQL query execution, for example const tableMetadata = await client.metadata.getTable('ks1', 'table1'); promises are created in the drivers using the Promise constructor by default. Finally, not taking much of your precious time, let's quickly dive into the journal entry "Node.js Hash Password using BCrypt"; a sketch follows below.
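A minimal password-hashing sketch, assuming the bcrypt npm package (bcryptjs exposes a very similar API):

```js
// hash.js — password hashing sketch, assuming the "bcrypt" npm package
const bcrypt = require('bcrypt');

const saltRounds = 10; // cost factor: higher is slower but harder to brute-force

async function main() {
  const hash = await bcrypt.hash('s3cret-password', saltRounds); // store this hash, never the plain text
  console.log('stored hash:', hash);

  const ok = await bcrypt.compare('s3cret-password', hash); // true if the password matches the hash
  console.log('password matches:', ok);
}

main().catch(console.error);
```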
For contrast, Redis supports data structures such as strings, hashes, lists, sets, sorted sets with range queries, bitmaps, hyperloglogs, geospatial indexes with radius queries, and streams, while Kafka concentrates on durable, ordered logs. Although Kafka itself runs on the JVM, producers and consumers can be written in many languages: kafka-python is best used with newer brokers (0.9+); there are many Kafka clients for C#, and confluent-kafka-dotnet is a high-performance, lightweight wrapper around librdkafka, a finely tuned C client; and a Kafka node can produce and consume messages to and from a Kafka cluster, along with topics, on Node-RED. The Kafka Consumer API allows applications to read streams of data from the cluster; in Samza-like designs you simply expose a function that is subscribed to a Kafka topic and gets called whenever there is a new message, which leaves you free to focus on the real business code. If you are pairing Kafka with Spark, please read the Kafka documentation thoroughly before starting the integration; in this tutorial I will help you build an application with Spark Streaming and Kafka integration in a few simple steps.

On the Node.js side, Node.js is a high-performance JavaScript runtime environment based on Google's V8 engine and evented I/O, so its event loop, timers, and process.nextTick() are worth understanding, and since the client-library calls are asynchronous they do not block the event loop. If you are stuck fetching data from Kafka with the kafka-node library, remember that the first thing you have to do is connect to the Kafka server. To include the built-in Events module, use the require() method; an EventEmitter sketch follows below. Also note that Kafka bean names for monitoring depend on the exact Kafka version you are running.
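As a reminder of how the Events module works, here is a minimal EventEmitter sketch; the event name and payload are arbitrary.

```js
// events-demo.js — using the built-in Events module via require()
const EventEmitter = require('events');

const emitter = new EventEmitter();

// Register a listener; it runs every time 'message' is emitted
emitter.on('message', (payload) => {
  console.log('received:', payload);
});

// Emit an event with a payload
emitter.emit('message', { topic: 'test', value: 'hello' });
```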
The goal of all this is to use Node.js to create a fast, reliable, and scalable data processing pipeline over a stream of events. In a previous article on Node.js for interacting with Apache Kafka, I described how to create a Node.js producer; here, once the web process consumes a message, it emits it over Socket.IO to all connected clients so they can see, for example, the URL we just crawled. One interesting aspect of Kafka is that it can store and replay all the messages it has received, in order, which makes this kind of pipeline easy to rebuild. Since the 0.8 release, all Kafka clients except the JVM client have been maintained outside the main code base; Kafka Streams, the core API for stream processing on the JVM (Java, Scala, Clojure, and so on), is covered in Kafka Streams in Action by Bill Bejeck. Given that Apache NiFi's job is to bring data from wherever it is to wherever it needs to be, it also makes sense that a common NiFi use case is moving data to and from Kafka, and Confluent's kafka-dotnet client covers the .NET side.

If you are on Node-RED, install the module with npm install node-red-contrib-kafka-node. To get started with the web half we created a simple Node.js app with Express; the default Node.js debugger port is 5858 if you need to attach a debugger, and note that parts of this setup do not support a Node.js version earlier than 8. Finally, the Fibonacci sequence is a famous mathematical number sequence in which the next number is equal to the sum of the previous two; a small sketch follows below.
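For completeness, here is a small sketch that generates that sequence:

```js
// fibonacci.js — returns the first n Fibonacci numbers: each number is the sum of the previous two
function fibonacci(n) {
  const seq = [0, 1];
  for (let i = 2; i < n; i++) {
    seq.push(seq[i - 1] + seq[i - 2]);
  }
  return seq.slice(0, n);
}

console.log(fibonacci(10)); // [ 0, 1, 1, 2, 3, 5, 8, 13, 21, 34 ]
```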
Node.js is a JavaScript runtime environment that executes JavaScript code outside of a browser, and since Node.js 8 people have started to promisify asynchronous functions and use await inside a try/catch. For running and debugging Node.js applications, start node with the --inspect flag (optionally with an explicit port) and attach your debugger. For monitoring, the Node.js collector is an npm package that you add to the dependencies of your Node.js application, and Kafka's own JMX metrics (for example the kafka.network and kafka.cluster beans, which exporter rules map to names such as kafka_network_$1_$2 with topic and partition labels) can be scraped once the broker is running; be sure to replace [hostname] with your Kafka server when configuring it. Previously, I posted about the Kafka REST Proxy from Confluent, which provides easy access to a Kafka cluster from any language; that post focused on the motivation, low-level examples, and implementation of the REST Proxy. Infrastructure can be managed as code too: the @pulumi/kafka package manages Kafka resources, you can walk through installing Apache Kafka and Apache ZooKeeper on Kubernetes with Helm, and a KAFKA_CREATE_TOPICS environment variable can be added to the docker-compose snippet if you want topics created automatically. I also re-wrote a Kafka producer node in Node-RED using Node.js, and combining Node.js with Kafka and GraphQL works well for serving the results.

For simpler point-to-point messaging, ZeroMQ (also spelled ØMQ, 0MQ or ZMQ) is a high-performance asynchronous messaging library aimed at use in distributed or concurrent applications; install it with npm install zeromq and use the push/pull socket pair, as in the sketch below. This was a basic example of how we can integrate Spark Streaming, Kafka, and Node.js; if you are looking for a similar demo application written with KSQL queries, check out the separate KSQL music demo walk-through.
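Here is a minimal push/pull sketch. It assumes the callback-style API of the zeromq package (v5.x; the newer v6 releases use awaitable sockets instead), and the address tcp://127.0.0.1:3000 is a placeholder.

```js
// pusher.js — PUSH side: binds an endpoint and hands out work
const zmq = require('zeromq'); // assumes the v5-style callback API
const sender = zmq.socket('push');

sender.bindSync('tcp://127.0.0.1:3000');
setInterval(() => sender.send(`work ${Date.now()}`), 500);
```

```js
// puller.js — PULL side: connects and receives work
const zmq = require('zeromq');
const receiver = zmq.socket('pull');

receiver.connect('tcp://127.0.0.1:3000');
receiver.on('message', (msg) => console.log('received:', msg.toString()));
```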
Stepping back, the larger point is that, using Apache Kafka as a robust, distributed, real-time, high-volume event bus, microservices implemented in Java, Node, Python, and SQL can collaborate without knowing about each other, each simply producing to and consuming from shared topics. The same patterns show up across stacks: there is a Java example of MongoDB acting as a Kafka consumer, a Kafka tutorial on writing a Kafka producer in Java, and combinations such as Spring Boot + Kafka + ZooKeeper on the back end with Node.js + Angular 2 + MongoDB in front. Node.js is also a good fit for background processing, and there are many ways of doing it; if you need a relational store, the recommended way to connect to Postgres is the node-postgres client, while the original tutorial used MongoDB as the database and Express for the web layer. If you have not yet created the base application, please head back and read the original tutorial; Step 1 is creating a basic Node/Express app, and a minimal sketch follows below.
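A minimal sketch of that first step, assuming Express is installed and using a placeholder port:

```js
// app.js — Step 1: a basic Node/Express app (port 3000 is a placeholder)
const express = require('express');
const app = express();
const port = 3000;

app.get('/', (req, res) => {
  res.send('Hello from Express');
});

app.listen(port, () => {
  console.log(`Listening on http://localhost:${port}`);
});
```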