Medallia is the pioneer and market leader in Experience Management. Our award-winning SaaS platform, Medallia Experience Cloud, leads the market in the understanding and management of experience for candidates, customers, employees, patients, citizens and residents.
We are more than a software company. We want to be known as a company that does the right thing, no matter the challenge or controversy. We are committed to creating a culture that values every person and every experience. Individual life experiences shape the way we interact with the world, which is why we encourage people to bring their whole selves to work each day. The strength of our global workforce is the most significant contributor to our success.
We believe: Every Experience Matters. Talent is Everywhere. All Belong Here.
At Medallia, we hire the whole person.
For us, our Data Engineers are the masters of deploying, automating, maintaining, troubleshooting, and improving the glue that keeps data flowing through our ecosystem: productionising the work our Data Scientists create and working with software engineers to ensure a robust end-to-end pipeline.
Our ideal candidate will be responsible for developing data pipelines (both batch and real-time) to ensure data flows smoothly from our edge to our core services (and between them too), and for turning the extensive research prototypes our Data Scientists develop into container-based, scalable solutions running in Kubernetes. You would be expected to work closely with our DevOps Engineers to develop system automation and auto-scaling design patterns.
· Deploying, automating, maintaining, and managing AWS cloud-based production systems.
· Managing the build, release, and configuration of production systems.
· System troubleshooting and problem solving across platform and application domains.
· Cloud: experience with a broad range of cloud technologies, especially AWS (e.g. S3, Lambda, ECS, Glue, RDS, EC2, etc.)
· Software Engineering skills: clean code, unit testing, CI/CD, automation
· Programming languages (Python, Java etc.).
· Experience with containerisation using Docker
Nice-to-have knowledge and experience
· Databases: Relational, Non-relational, data warehousing
· Streaming data platforms (Kafka)
· Experience with the HashiCorp stack, Ansible, Jenkins, and the ELK stack
· Experience with Kubernetes is a plus, but not essential.
· Operating Systems: Linux system administration.
· Understanding network topologies and common network protocols/services
This is a technical role, so to be successful you'll already have developed, maintained, and improved data pipelines, created data models and storage solutions, and gained hands-on experience with public cloud infrastructure; we use AWS, but experience with any provider is fine. If you've had the chance to explore design and configuration with a security focus, that is definitely a plus, but not essential.
At Medallia, we celebrate diversity and recognize the value it brings to our customers and employees. Medallia is proud to be an equal opportunity workplace and is an affirmative action employer. Equal opportunity is afforded to all qualified applicants and employees. We do not discriminate on the basis of gender identity or expression, race, ethnicity, religion, national origin, age, sex, marital status, physical or mental disability, Veteran status, sexual orientation, or any other protected category. We also consider all qualified applicants regardless of criminal histories, consistent with legal requirements.
Medallia is committed to working with and providing reasonable accommodation to applicants with disabilities in accordance with the Americans with Disabilities Act and local disability laws.
For information regarding how Medallia collects and uses personal information, please review our Privacy Policy.