Comcast Eng 4, Software Dev & Engineering in West Chester, Pennsylvania
Comcast's Technology & Product organization works at the intersection of media and technology. Our innovative teams are continually developing and delivering products that transform the customer experience. From creating apps like TVGo to new features such as the Talking Guide on the X1 platform, we work every day to make a positive impact through innovation in the pursuit of building amazing products that are enjoyable, easy to use and accessible across all platforms. The team also develops and supports our evolving network architecture, including next-generation consumer systems and technologies, infrastructure and engineering, network integration and management tools, and technical standards.
The dx Team's responsibilities include Data Engineering for Comcast; one of its major goals is to harmonize the data ingestion and consumption layers across Comcast. Creating enterprise data sources that serve as a single version of truth is a core goal of the dx Team.
The TL Data Engineer will provide ETL solutions that answer technically challenging business requirements (complex transformations, high data volumes). The ideal candidate will have a deep understanding of technical and functional designs for ETL, Database, Data Warehousing, and Reporting areas. This job plays a key role in Customer 360 projects and in systems design and development. This team is accountable for ensuring that the overall functional and technical design of data engineering processes and applications meets business and functional requirements. In addition, the work must be sustainable and conform to the operational infrastructure.
This position will be responsible for designing, developing, testing, tuning, and deploying software solutions within the Hadoop, AWS, and Teradata ecosystems.
The Engineer will work closely with administrators and the Data Product team to ensure that data for various subject areas is highly available and performs within agreed-upon service levels. This position will also work closely with architects and other data engineering teams in an agile manner to quickly realize business value.
-Play a senior-level role in the ETL and Data Warehouse effort by implementing a solid, robust, extensible design that supports key business flows
-Build and maintain optimized ETL solutions that process and load source-system data into AWS and Hadoop (using Spark, Sqoop, or REST APIs) and into Teradata (using Informatica and other utilities)
-Develop, implement and maintain development best practices within Hadoop and Teradata environments
-Strong experience with Spark, Hive, Pig, Flume, Sqoop, Kafka, and Storm
-Work with peers in administration to tune code and plan for capacity needs
-Deliver clear, well-communicated and complete data engineering artifacts.
-Analyze and solve problems and recommend improvements to existing systems and processes.
-Design, code and test major segments of a system in a timely manner.
-Lead unit, system acceptance, and performance testing by designing test cases, building test data, executing and evaluating tests, and recommending and making improvements and fixes to the system
-Ensure data security, data quality, and governance of data within the Hadoop and Teradata ecosystems
-Knowledge in data warehousing methodologies and best practices required.
-Strong verbal and written communication skills required.
-Effective interpersonal relations skills, ability to effectively collaborate with others and work as part of a team required.
-Skills in navigating a large organization in order to accomplish results required.
-Ability to initiate and follow through on complex projects of both short and long term duration required.
-Excellent organizational and time management skills required.
-Excellent analytical and problem solving skills required.
-Works independently; assumes responsibility for job development and training; researches and resolves questions and problems; requests supervisor input and keeps the supervisor informed
-Participate on interdepartmental teams to support organizational goals
-Perform other related duties and tasks as assigned
-Punctual, regular, and consistent attendance
-Responsible for delivering solution designs
-Responsible for delivering solutions (deliverables)
-Responsible for delivering SRE documentation and turnover
-Responsible for delivering best practices and standards
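The ETL responsibilities above follow the classic extract-transform-load pattern. As a toy illustration of that flow only (plain-Python stand-in, not the Spark/Informatica stack this role actually uses; all names and data are hypothetical), a minimal pipeline with a basic data-quality rule might look like:

```python
import csv
import io

def extract(raw_csv):
    """Extract: parse raw source-system records from CSV text."""
    return list(csv.DictReader(io.StringIO(raw_csv)))

def transform(rows):
    """Transform: normalize fields and drop records that fail quality checks."""
    clean = []
    for row in rows:
        if not row["customer_id"].strip():  # basic data-quality rule
            continue
        clean.append({
            "customer_id": row["customer_id"].strip(),
            "region": row["region"].strip().upper(),
        })
    return clean

def load(rows, warehouse):
    """Load: append conformed records to an (in-memory) warehouse table."""
    warehouse.extend(rows)
    return len(rows)

# Hypothetical source extract: one record has a blank customer_id.
source = "customer_id,region\n101,east\n ,west\n102,central\n"
warehouse = []
loaded = load(transform(extract(source)), warehouse)
print(loaded)  # 2 records survive the quality check
```

In a production setting the same three stages would be distributed (Spark jobs, Sqoop imports, Informatica mappings) and the "warehouse" would be Hive, Teradata, or S3, but the shape of the work is the same.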
-Bachelor's Degree in Computer Science or a related field is required.
-Seven years of experience as a data integration architect, data solution architect, ETL architect, or in a similar role required.
-Five to seven years of experience with data integration and development of end-to-end ETL architectures using AWS, Hadoop, Linux, Informatica, Teradata, SQL, and BTEQ.
-Hands-on experience developing applications using Hadoop ecosystem components, e.g., Spark, Sqoop, Hive, Pig, Flume, Accumulo, HBase, Kafka, and Storm
-Requires understanding of the complete SDLC and experience with continuous integration, test-driven/behavior-driven development, and agile/scrum development methodologies
-Ability to work effectively across organizational boundaries
-Excellent oral, written, analytical, problem solving, and presentation skills
-Manage and coordinate matrixed resources
-Experience with managed-service delivery and onshore/offshore development is a must
Desired Skills/Experience
-Telecommunications experience; knowledge of telecommunications/cable billing and customer care systems, e.g., Amdocs, DST, CSG
-Knowledge of NoSQL platforms;
-Hadoop, Teradata, or TOGAF certification
Comcast is an EEO/AA/ Drug Free Workplace
Comcast is an EOE/Veterans/Disabled/LGBT employer