NewsWhip is the leading provider of predictive media intelligence, tracking and predicting engagement with the world’s news stories each day.
Our platform is used every day by journalists at hundreds of top-tier publications; by communications professionals at all of the top 10 global PR agencies and at dozens of international brands; and by NGOs, international organisations and not-for-profits.
As an Irish company, our product and engineering functions are central to our business. This presents a great opportunity to have direct input into what we build and how we build it. There is a direct link between your work, our customers and the continued success of NewsWhip.
As a Backend Engineer on our Data Platform team, you will be charged with developing, expanding and maintaining the systems that extract, enrich, persist and ultimately expose the substantial volumes of social data ingested by the Sources Team, enabling impactful and actionable insights.
- Build and maintain the infrastructure required for optimal extraction, transformation and loading of large data sets
- Support and expand various services in a high-volume, event based distributed system
- Develop and support endpoints exposing feature value from our Data Warehouse to our customer-facing Application Development teams and internal Analysts
- Interface with other internal stakeholders from both technical and non-technical teams (Engineering, PM, Data Science)
Requirements:
- 3+ years in software development
- Deep experience with at least one modern programming language, such as Java, Scala, Go or Python
- Knowledge of software design principles and leading software development practices
- Strong communication & collaboration skills
- Willingness to get things done, learn new things, take initiative and challenge existing assumptions and conventions
- Experience working with storage systems such as Elasticsearch, Cassandra, Kafka and MySQL
- Experience with high volume event streaming/queueing systems (Kafka, Kinesis, RabbitMQ etc.)
- Experience building and deploying to any cloud service (GCP, AWS, Azure, etc.)
Nice to have:
- Experience with time-series datastores implemented in NoSQL (Cassandra, DynamoDB, MongoDB, etc.) or dedicated time-series databases (Druid, InfluxDB, etc.)
- Experience with the Lightbend Reactive Platform (Scala and Akka)
- Experience with distributed database/data processing technologies (Spark, Presto, etc.)
- Experience working with schedulers in a distributed, service-oriented environment (Airflow, Step Functions, Argo etc.)
- Experience with observability principles (Instrumentation, Tracing, Telemetry)
- Experience building real-time integrations with a variety of external APIs, connectors and services (RESTful/streaming integrations, disparate authentication mechanisms, rate-limit considerations, etc.)
- Experience with IaC and DevOps methodologies
- Knowledge of basic Linux administration, Kubernetes and Google Cloud Platform
- Experience working in an agile environment with iterative development and fast feedback
- Experience with graph databases (Neo4j, ArangoDB, etc.)
- Experience with Natural Language Processing: implementing provided models for Named Entity/Semantic Extraction and Linking, Sentiment Analysis, Content Classification, etc.
Benefits:
- Competitive salary
- Health insurance
- Bonus for individual and team performance
- Great working environment, with a remote-first or hybrid model
- An opportunity to help define an entirely new industry category
- Excellent opportunity to grow in one of Dublin's fastest-growing home-grown companies
We believe in maintaining a friendly work environment, a healthy work-life balance, and compensating our employees fairly for their input. You'll be part of a team that believes in mutual support and education, and work for a company where a work week isn't just the gap between weekends, but an opportunity to do work that is impactful and innovative. We also love eating and socialising together, annual and seasonal company and team retreats, healthy (and unhealthy) snacks, and other perks.