Python Developer - Solution Specialist - USDC
Are you an experienced, passionate pioneer in technology? Are solutions your focus? Are you a roll-up-your-sleeves professional who thrives in a daily collaborative environment and a think tank who can share new ideas with colleagues, without the extensive demands of travel? If so, consider an opportunity with our US Delivery Center (USDC); we are breaking the mold of the typical Delivery Center.
Our US Delivery Centers have been growing since 2014 with significant, continued growth on the horizon.
Interested? Read more about our opportunity below.
Work You'll Do/Responsibilities
Function as an integrator between business needs and technology, helping to create solutions that meet clients' requirements.
Be responsible for developing and testing solutions that align with clients' systems strategy, requirements, and design, as well as supporting system implementation.
Manage the data pipeline process from acquisition through ingestion, storage, and provisioning of data to the point of impact by modernizing and enabling new capabilities.
Facilitate data integration across traditional and Hadoop environments by assessing clients' enterprise IT environments.
Guide clients to a future-state IT environment that supports their long-term business goals.
Enhance business drivers through enterprise-scale applications that enable visualization, consumption, and monetization of both structured and unstructured data.
Required Qualifications
3+ years of Python development experience
Strong technical skills, including an understanding of software development principles
Experience with data lake and data hub implementation
Knowledge of AWS or Azure platforms
Ability to translate business requirements into logical and physical file structure designs
Ability to build and test solutions in an agile delivery manner
Ability to articulate the reasons behind design choices
Bachelor's degree, preferably in Computer Science, Information Technology, Computer Engineering, or related IT discipline, or equivalent experience
Ability to travel 10%, on average, based on the work you do and the clients and industries/sectors you serve; this may include overnight travel
Expected to co-locate in your designated office/USDC location up to 30% of the time
Must live within a commutable distance of, or be willing to relocate to, one of the following Delivery locations: Gilbert, AZ; Lake Mary, FL; Mechanicsburg, PA
Must be legally authorized to work in the United States without the need for employer sponsorship, now or at any time in the future
Preferred Qualifications
3+ years of experience working with the Big Data ecosystem, including tools such as Hadoop, Spark, MapReduce, Sqoop, HBase, Hive, and Impala
Experience and knowledge working with Kafka, Spark Streaming, Sqoop, Oozie, Airflow, Control-M, Presto, NoSQL, and SQL
Expert-level proficiency with Jenkins and GitHub