Madison, Wisconsin
Job Type
Reference Number
Deb Wethal, Spherion
Apply Now

Job Description

Spherion, in partnership with American Family Insurance, is sourcing for a Technical Consultant. The Technical Consultant for DSAL (Data Science & Analytics Lab) will contribute to our mission through the design, development, implementation, and maintenance of high-performance parallel and distributed application and infrastructure systems. The position will lead the development of our data processing systems and their interfaces, with a focus on enabling scalable, high-performance, distributed computing environments. The Technical Consultant is responsible for implementing new data processing technologies and building prototypes. The position will also contribute to the implementation of machine learning and statistical algorithms, including making them more efficient and scalable. As a key member of the Data Science team, this position will work closely with the scientists and will contribute to modeling and data mining efforts as needed.

Working hours: M-F 1st shift

Job Duties
Data Science Infrastructure Design, Development, and Operations (75%)
--Participate in architectural planning, design, development, deployment, and management of analytical environments capable of ingesting, processing, and analyzing large, diverse data sets
--Participate in developing holistic solution architectures, ensuring that all architectural aspects of the system, including data, application, infrastructure, and security, are addressed
--Develop high-performance parallel or distributed computing environments as needed, including those based on the Apache stack (Spark, Spark Streaming) as well as Amazon Web Services (AWS) such as Kinesis

Data Science Research Support (20%)
--Participate in the definition and planning of data processing and scalable analytical and computational platforms.
--Maximize the predictability, efficiency, effectiveness, and maintainability of data science-related infrastructure elements with a focus on analytical compute environments.
--Develop means for automating data- and analytics-related systems and processes, as appropriate, to support data science activities.

Desired skills (not necessarily listed in order of priority):
--Java, Spark, Spark Streaming, Hive, Kafka; also experience or familiarity with AWS Kinesis, S3 and EC2 instances, CloudFormation, Lambda, DynamoDB.
--Desirable but not required: Tableau, NiFi, Atlassian tools
--Java 7/8, Spring Framework (MVC and Boot used heavily), SQL/MySQL/PostgreSQL, Maven, Git, Docker, Linux (CentOS and Ubuntu), Bash, Web-service/micro-service/REST architecture, AWS (CloudFormation, EC2, S3, Kinesis, Lambda, RDS, DynamoDB) or similar cloud experience (e.g. Google Cloud).
--Spark, Spark Streaming, and the Hadoop ecosystem (MapReduce, Hive, Kafka, etc.). Tools are secondary, but principles of large-scale data ingest/analysis are important.

The individual should have work experience in the above skill areas.

Spherion is a world leader in matching great people with great companies. Our experienced agents will listen carefully to your employment needs and then work diligently to match your skills and qualifications to the right job and company. Whether you're looking for temporary, temporary-to-permanent or permanent opportunities, no one works harder for you than Spherion. EEO Employer: Race, Religion, Color, National Origin, Citizenship, Sex, Age, Disability, Ancestry, Veteran Status, Genetic Information, Service in the Uniformed Services or any other classification protected by law.


Education: High School

Apply Now