Data Engineer - Marketing

Company:

Kimberly-Clark


Job details

Description

Your Job

You're not the person who will settle for just any role.
Neither are we.
Because we're out to create Better Care for a Better World, and that takes a certain kind of person and teams who care about making a difference.
Here, you'll bring your professional expertise, talent, and drive to building and managing our portfolio of iconic, ground-breaking brands.
In your role, you'll help us deliver better care for billions of people around the world.
It starts with YOU.

About Us

Huggies®. Kleenex®. Cottonelle®. Scott®. Kotex®. Poise®. Depend®. Kimberly-Clark Professional®.
You already know our legendary brands—and so does the rest of the world.
In fact, millions of people use Kimberly-Clark products every day.
We know these amazing Kimberly-Clark products wouldn't exist without talented professionals, like you.

At Kimberly-Clark, you'll be part of the best team committed to driving innovation, growth and impact.
We're founded on 150 years of market leadership, and we're always looking for new and better ways to perform – so there's your open door of opportunity.
It's all here for you at Kimberly-Clark; you just need to log on!

Led by Purpose. Driven by You.

About You

You're driven to perform at the highest level possible, and you appreciate a performance culture fueled by authentic caring.
You want to be part of a company actively dedicated to sustainability, inclusion, wellbeing, and career development. You love what you do, especially when the work you do makes a difference.
At Kimberly-Clark, we're constantly exploring new ideas on how, when, and where we can best achieve results.
When you join our team, you'll experience Flex That Works: flexible (hybrid) work arrangements that empower you to have purposeful time in the office and partner with your leader to make flexibility work for both you and the business.

Main responsibilities:

Design and operationalize enterprise data solutions on cloud platforms: Develop and implement scalable and secure data solutions on cloud platforms, ensuring they meet enterprise standards and requirements.
This includes designing data architecture, selecting appropriate cloud services, and optimizing performance for data processing and storage.
Integrate Azure services, Snowflake technology, and other third-party data technologies: Seamlessly integrate various data technologies, including Azure services, Snowflake, and other third-party tools, to create a cohesive data ecosystem.
This involves configuring data connectors, ensuring data flow consistency, and managing dependencies between different systems.
Build and maintain high-quality data pipelines for analytic solutions: Develop robust data pipelines that automate the extraction, transformation, and loading (ETL) of data from various sources into a centralized data warehouse or lake.
Ensure these pipelines are efficient, reliable, and capable of handling large volumes of data.
Collaborate with a multidisciplinary agile team to generate insights from connected data: Work closely with data scientists, analysts, and other team members in an agile environment to translate business requirements into technical solutions.
Participate in sprint planning, stand-ups, and retrospectives to ensure timely delivery of data products.
Manage and create data inventories for analytics and APIs to be consumed: Develop and maintain comprehensive data inventories that catalog available data assets and their metadata.
Ensure these inventories are accessible and usable by various stakeholders, including through APIs that facilitate data consumption.
Design data integrations with internal and external products: Architect and implement data integration solutions that enable seamless data exchange between internal systems and external partners or products.
This includes ensuring data integrity, security, and compliance with relevant standards.
Build data visualizations to support analytic insights: Create intuitive and insightful data visualizations using tools like PowerBI to help stakeholders understand complex data sets and derive actionable insights.
This involves designing dashboards, reports, and interactive visualizations that effectively communicate key metrics and trends.
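By way of illustration, the pipeline responsibility above often reduces to a repeatable extract-transform-load step. The sketch below is a minimal Python example under assumed names: the campaign_spend.csv file, the stg_campaign_spend table, and the SQLite connection string are placeholders, not Kimberly-Clark systems, and a production pipeline would normally run under an orchestrator such as Azure Data Factory or Airflow rather than a single script.

```python
# Minimal extract-transform-load sketch. The file, table, and connection
# details are illustrative placeholders only.
import pandas as pd
from sqlalchemy import create_engine


def extract(path: str) -> pd.DataFrame:
    """Read one raw source extract (here, a local CSV) into a DataFrame."""
    return pd.read_csv(path)


def transform(df: pd.DataFrame) -> pd.DataFrame:
    """Light cleansing: lowercase column names, drop exact duplicates, stamp load time."""
    df = df.rename(columns=str.lower).drop_duplicates()
    df["load_ts"] = pd.Timestamp.now(tz="UTC")  # simple audit column
    return df


def load(df: pd.DataFrame, table: str, conn_str: str) -> None:
    """Append the cleansed rows to a warehouse staging table."""
    engine = create_engine(conn_str)
    df.to_sql(table, engine, if_exists="append", index=False)


if __name__ == "__main__":
    raw = extract("campaign_spend.csv")        # hypothetical source file
    load(transform(raw),
         "stg_campaign_spend",                 # hypothetical staging table
         "sqlite:///warehouse.db")             # placeholder connection string
```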
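Likewise, the data-inventory and API responsibility can be pictured as a thin read-only metadata service. The sketch below assumes a hypothetical catalog and endpoint layout and uses FastAPI purely for illustration; it is not a description of the team's actual stack.

```python
# Sketch of a read-only data-inventory endpoint. Catalog contents and
# endpoint paths are hypothetical examples.
from fastapi import FastAPI, HTTPException

app = FastAPI(title="Data inventory (sketch)")

# In practice this catalog would live in a metadata store, not in code.
DATA_INVENTORY = {
    "stg_campaign_spend": {
        "description": "Daily marketing campaign spend by channel",
        "owner": "marketing-data",
        "refresh": "daily",
    },
}


@app.get("/datasets")
def list_datasets() -> list[str]:
    """Return the names of all cataloged datasets."""
    return sorted(DATA_INVENTORY)


@app.get("/datasets/{name}")
def get_dataset(name: str) -> dict:
    """Return the metadata record for one dataset, or 404 if unknown."""
    if name not in DATA_INVENTORY:
        raise HTTPException(status_code=404, detail="unknown dataset")
    return DATA_INVENTORY[name]
```

Served with `uvicorn module_name:app` (module name hypothetical), the endpoints return the catalog as JSON for downstream consumers.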
Key Qualifications and Experiences:

Bachelor's Degree in Management Information Systems/Technology, Computer Science, Engineering, or related discipline.
MBA or equivalent is preferred.
7+ years of experience designing large-scale data solutions, performing design assessments, crafting design options and analyses, and finalizing the preferred solution in collaboration with IT and business stakeholders.
7+ years of data engineering or design experience, designing, developing, and deploying scalable enterprise data analytics solutions from source system through ingestion and reporting.
Expertise in data modeling principles/methods, including conceptual, logical, and physical data models for data warehouses, data lakes, and/or database management systems.
5+ years of experience tailoring, configuring, and crafting solutions within the Snowflake environment, including a profound grasp of Snowflake's data warehousing capabilities, data architecture, SQL optimization for Snowflake, and leveraging Snowflake's unique features such as Snowpipe, Streams, and Tasks for real-time data processing and analytics.
A strong foundation in data migration strategies, performance tuning, and securing data within the Snowflake ecosystem is essential.
1-3+ years demonstrated expertise in architecting solutions within the Snowflake ecosystem, adhering to best practices in data architecture and design patterns.
5+ years of hands-on experience designing, building, and operationalizing data solutions and applications using cloud data and analytics services in combination with 3rd parties.
7+ years of hands-on relational, dimensional, and/or analytic experience (using RDBMS, dimensional, NoSQL data platform technologies, and ETL and data ingestion protocols).
7+ years of experience with database development and scripting.
Expertise in Amazon Marketing Cloud (AMC) integration and data analysis for marketing insights and campaign performance tracking.
Deep understanding of data architecture, data engineering, data warehousing, data analysis, reporting, and data science techniques and workflows: You should have a comprehensive knowledge of designing and implementing data systems that support various analytic and operational use cases, including data storage, processing, and retrieval.
Skilled in creating data products that support analytic solutions: Proficiency in developing data products that enable stakeholders to derive meaningful insights and make data-driven decisions.
This involves creating datasets, data models, and data services tailored to specific business needs.
Proficiency in working with APIs and understanding data structures to serve them: Experience in designing, developing, and consuming APIs for data access and integration.
This includes understanding various data structures and formats used in API communication.
Experience with Object-Relational Mapping (ORM) frameworks: Familiarity with ORM frameworks, such as Hibernate or Entity Framework, to efficiently map data between relational databases and application code.
Knowledge of managing sensitive data, ensuring data privacy and security: Expertise in handling sensitive data with strict adherence to data privacy regulations and security best practices to protect against unauthorized access and breaches.
Expertise in data visualization tools, specifically PowerBI: Proficiency in using data visualization tools like PowerBI to create interactive and insightful dashboards and reports that effectively communicate complex data insights.
Strong problem-solving skills and ability to work as part of a technical, cross-functional analytics team: Excellent analytical and troubleshooting abilities, with the capability to collaborate effectively with team members from various technical and business domains.
Experience with relational and non-relational databases (NoSQL, graph databases, etc.): Solid experience in working with different types of databases, including traditional relational databases (e.g., SQL Server, MySQL) and non-relational databases (e.g., MongoDB, graph databases).
Agile learner with a passion for solving complex data problems and delivering insights: A proactive and continuous learner with enthusiasm for addressing challenging data issues and providing valuable insights through innovative solutions.
Proficiency in programming languages such as SQL, NoSQL, Python, Java, R, and Scala: Strong coding skills in multiple programming languages used for data manipulation, analysis, and pipeline development.
Familiarity with relational and non-relational databases, including GraphQL and MongoDB: In-depth understanding of both relational (SQL-based) and non-relational (NoSQL) databases, with specific experience in technologies like GraphQL and MongoDB.
Experience with ETL (extract, transform, and load) systems and API integrations: Expertise in building and maintaining ETL processes to consolidate data from various sources into centralized repositories, and integrating APIs for seamless data exchange.
Understanding of data storage solutions, knowing when to use a data lake versus a data warehouse: Knowledge of different data storage architectures and the ability to choose the appropriate solution (data lake or data warehouse) based on specific use cases and data characteristics.
Ability to write scripts for automation and repetitive task management: Proficiency in scripting languages (e.g., Python, Bash) to automate data processing tasks and reduce manual efforts.
Basic understanding of machine learning concepts to support data scientists on the team: Familiarity with key machine learning principles and techniques to better collaborate with data scientists and support their analytical models.
Proficiency with big data tools such as Hadoop, MongoDB, and Kafka: Experience in using big data technologies to manage, process, and analyze large datasets efficiently.
Knowledge of cloud computing, including cloud storage, product portfolios, and pricing models (Azure): Understanding of cloud platforms and services, particularly Azure, including storage options, available tools, and cost considerations.
Experience in data security, ensuring data is securely managed and stored to protect it from loss or theft while maintaining compliance: Strong background in implementing security measures to safeguard data and comply with regulatory requirements.
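To ground the Snowflake qualification above, the Streams and Tasks pattern it names typically looks like the following: a stream tracks new rows landing in a staging table, and a scheduled task merges them forward whenever the stream has data. All object names, credentials, column names, and the five-minute schedule below are hypothetical placeholders, submitted here through the Snowflake Python connector as a minimal sketch.

```python
# Sketch of the Snowflake Streams + Tasks pattern for incremental loading.
# Every identifier and credential below is a placeholder.
import snowflake.connector

conn = snowflake.connector.connect(
    user="<user>",
    password="<password>",
    account="<account>",
    warehouse="TRANSFORM_WH",
    database="MARKETING",
    schema="STAGING",
)

statements = [
    # A stream captures changes landing in the staging table.
    "CREATE OR REPLACE STREAM STG_CAMPAIGN_SPEND_STREAM ON TABLE STG_CAMPAIGN_SPEND",
    # A task runs on a schedule, but only when the stream has new rows,
    # and appends those rows to the reporting table.
    """
    CREATE OR REPLACE TASK LOAD_FCT_CAMPAIGN_SPEND
      WAREHOUSE = TRANSFORM_WH
      SCHEDULE  = '5 MINUTE'
      WHEN SYSTEM$STREAM_HAS_DATA('STG_CAMPAIGN_SPEND_STREAM')
    AS
      INSERT INTO FCT_CAMPAIGN_SPEND (CAMPAIGN_ID, SPEND_DATE, CHANNEL, SPEND)
      SELECT CAMPAIGN_ID, SPEND_DATE, CHANNEL, SPEND
      FROM STG_CAMPAIGN_SPEND_STREAM
      WHERE METADATA$ACTION = 'INSERT'
    """,
    # Tasks are created suspended; resuming starts the schedule.
    "ALTER TASK LOAD_FCT_CAMPAIGN_SPEND RESUME",
]

cur = conn.cursor()
for stmt in statements:
    cur.execute(stmt)
cur.close()
conn.close()
```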
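For the ORM qualification, the listing names Hibernate and Entity Framework; the closest Python counterpart is SQLAlchemy, shown here only as a minimal sketch with a hypothetical campaigns table mapped to a class and an in-memory database standing in for a real one.

```python
# Minimal ORM sketch using SQLAlchemy: map a table to a Python class,
# then insert and read rows through objects instead of hand-written SQL.
# Table, column, and database names are illustrative placeholders.
from sqlalchemy import String, create_engine, select
from sqlalchemy.orm import DeclarativeBase, Mapped, Session, mapped_column


class Base(DeclarativeBase):
    pass


class Campaign(Base):
    """Each instance corresponds to one row in the 'campaigns' table."""
    __tablename__ = "campaigns"

    id: Mapped[int] = mapped_column(primary_key=True)
    name: Mapped[str] = mapped_column(String(100))
    budget: Mapped[float]


engine = create_engine("sqlite:///:memory:")   # placeholder database
Base.metadata.create_all(engine)               # emit CREATE TABLE from the mapping

with Session(engine) as session:
    session.add(Campaign(name="Spring launch", budget=25000.0))
    session.commit()
    for row in session.scalars(select(Campaign)):
        print(row.name, row.budget)
```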
Total Benefits

Great support for good health with medical, dental, and vision coverage options with no waiting periods or pre-existing condition restrictions.
Access to an on-site fitness center, occupational health nurse, and allowances for high-quality safety equipment.
Flexible savings and spending accounts to maximize health care options and stretch dollars when caring for yourself or dependents.
Diverse income protection insurance options to protect yourself and your family in case of illness, injury, or other unexpected events.
Additional programs and support to continue your education, adopt a child, relocate, or even find temporary childcare.

To Be Considered

Click the Apply button and complete the online application process.
A member of our recruiting team will review your application and follow up if you seem like a great fit for this role. In the meantime, check out the careers website. You'll want to review this and come prepared with relevant questions when you pass GO and begin interviews.

For Kimberly-Clark to grow and prosper, we must be an inclusive organization that applies the diverse experiences and passions of its team members to brands that make life better for people all around the world.
We actively seek to build a workforce that reflects the experiences of our consumers.
When you bring your original thinking to Kimberly-Clark, you fuel the continued success of our enterprise.
We are a committed equal opportunity employer, and all qualified applicants will receive consideration for employment without regard to race, color, religion, sex, national origin, disability status, protected veteran status, sexual orientation, gender identity, age, pregnancy, genetic information, citizenship status, or any other characteristic protected by law.
The statements above are intended to describe the general nature and level of work performed by employees assigned to this classification.
Statements are not intended to be construed as an exhaustive list of all duties, responsibilities, and skills required for this position.

Additional information about the compensation and benefits for this role is available upon request.
You may contact -now.com for assistance.
You must include the six-digit Job # with your request.

This role is available for local candidates already authorized to work in the role's country only.
Kimberly-Clark will not provide relocation support for this role.
Primary Location: Argentina - Buenos Aires
Additional Locations: Sao Paulo Office
Worker Type: Employee
Worker Sub-Type: Regular
Time Type: Full time

