Basic responsibilities of big data architects
Big data architects participate in planning the whole pipeline from data source to data application, and take part in decisions about related products. The following are the basic responsibilities of a big data architect, compiled for your reference.

Basic responsibilities of big data architects 1

Responsibilities:

1. Responsible for the design and construction of the overall big data platform architecture;

2. Responsible for building a general data-exchange and task-scheduling platform for the big data platform;

3. Formulate standards and specifications for development, testing, implementation and maintenance; guide and train engineers, and continuously improve team capabilities;

4. Participate in system requirements analysis, architecture design, technology selection, application design and development, testing and deployment, and be responsible for writing core code;

5. Continuously explore new technical directions and overcome challenges such as large data volumes, high concurrency, high availability and scalability.

Requirements:

1. More than 3 years of experience in big data architecture, with rich experience in data warehouse, data mining and machine learning projects.

2. Practical experience in the architecture and design of large-scale data processing.

3. Proficient in Spark, MR, HDFS, Yarn, HBase, Hive and MongoDB; familiar with Kafka, Redis, Storm, Mahout, Flume, ElasticSearch and graph databases; rich experience in large-scale data platform engineering.

4. Deep understanding of big data processing technologies and implementation methods (stream computing, distributed computing, distributed file systems, distributed storage, etc.).

5. Familiar with enterprise data management systems and methods such as master data, metadata and data quality, and familiar with development environments on Linux/Unix platforms.

6. Bachelor degree or above in computer software or a related major; rich Java development experience and an Internet background are preferred.

7. Strong problem analysis and handling ability, excellent hands-on ability, keen on technology and striving for perfection.
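The MR (MapReduce) model listed among the proficiency requirements above can be illustrated with a minimal word count. This is only a toy sketch of the map/shuffle/reduce phases in pure Python, not production Hadoop code; the input lines are invented.

```python
from collections import defaultdict

def map_phase(lines):
    # Map: emit a (word, 1) pair for every word in every line
    for line in lines:
        for word in line.split():
            yield (word, 1)

def shuffle(pairs):
    # Shuffle: group emitted values by key, as the framework would
    groups = defaultdict(list)
    for key, value in pairs:
        groups[key].append(value)
    return groups

def reduce_phase(groups):
    # Reduce: sum the counts collected for each word
    return {word: sum(counts) for word, counts in groups.items()}

lines = ["big data big platform", "data platform"]
result = reduce_phase(shuffle(map_phase(lines)))  # {'big': 2, 'data': 2, 'platform': 2}
```

In a real cluster the shuffle is performed by the framework across machines; the programmer supplies only the map and reduce functions.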

Basic responsibilities of big data architects 2

Responsibilities:

1. Deeply understand the business model of the government industry, build data models for the government industry, and formulate the roadmap for the company's big data technology;

2. Liaise with business research and technical departments, actively collect and translate requirements, organize the data center's business development, and analyze and design requirements for data-related products;

3. Build a data warehouse, develop a database management system, collect, extract and process the massive data accumulated by business, and conduct data analysis and mining;

4. According to the company's strategy and development needs, plan the key objectives and tasks of the data center; manage department personnel and affairs, carry out cross-departmental and cross-regional cooperation, and assist with external exchanges and cooperation.

Job requirements:

1. More than 5 years of relevant working experience; team management and project management experience is preferred;

2. Understand government operating mechanisms and master knowledge of the financial industry; experience in the e-government industry is preferred;

3. Skilled in Java or Python, proficient in SQL and databases such as Oracle; experience with machine learning models and algorithms is preferred;

4. Have overall product-planning thinking for the data center; experience in big data processing, analysis and mining is preferred;

5. Rigorous logical thinking, able to abstract, decompose and standardize business, and excellent oral and written expression skills;

6. Have a strong sense of the overall situation and good sense of teamwork, have leadership skills, and have excellent interpersonal and communication skills.
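The SQL proficiency that requirement 3 above asks for can be illustrated with a minimal, self-contained example using Python's built-in sqlite3 module. The `orders` table and its columns are hypothetical, invented purely for illustration.

```python
import sqlite3

# Build a throwaway in-memory database with a hypothetical orders table
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE orders (region TEXT, amount REAL)")
conn.executemany(
    "INSERT INTO orders VALUES (?, ?)",
    [("east", 100.0), ("east", 50.0), ("west", 70.0)],
)

# A typical aggregation query: total amount per region
rows = conn.execute(
    "SELECT region, SUM(amount) FROM orders GROUP BY region ORDER BY region"
).fetchall()  # [('east', 150.0), ('west', 70.0)]
conn.close()
```

The same GROUP BY pattern carries over directly to Oracle or any other relational database, only the connection layer changes.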

Basic Responsibilities of Big Data Architects 3

Responsibilities:

1. Engage in business research, product standard construction, core model design and optimization, system testing and other work related to big data projects in the telecommunications industry.

2. Study the data modeling scheme and modeling tools together with the data professional committee, and be responsible for the data architecture and data model design of the product line.

3. Participate in the research of data conversion methods between databases, participate in data migration in the project, collect data migration experience in the project, and optimize the data model of the product.

4. Train the department team in basic data model theory and build the data modeling team.

Job requirements:

1. Bachelor degree, more than 3 years of ETL design and development experience with mainstream databases (DB2, Oracle, SQL Server, MySQL, etc.); experience in designing logical and physical models for large data warehouses; proficient in SQL, with solid SQL performance tuning experience;

2. Experience with mathematical modeling tools such as Python and R; some experience in data processing and modeling, able to output model analysis results, model comparisons, model efficiency, and methods for judging theories and models, and able to fully explain and interpret them;

3. Be familiar with the basic principles of statistics and have done practical data modeling projects;

4. Experience in distributed data warehouse construction, and experience in data warehouse construction in telecom industry is preferred;
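The ETL experience requirement above can be sketched in miniature: extract raw records, transform them (here, by validating and typing fields), and load the results. This is a toy pure-Python sketch, not a real ETL job against DB2/Oracle/SQL Server/MySQL, and the field names and data are invented.

```python
import csv
import io

# Extract: raw CSV source data (one deliberately malformed row)
raw = "id,amount\n1,10.5\n2,abc\n3,4.5\n"

def transform(rows):
    # Transform: type-convert fields, discarding rows that fail validation
    for row in rows:
        try:
            yield {"id": int(row["id"]), "amount": float(row["amount"])}
        except ValueError:
            continue  # a real job would route bad records to an error table

# Load: here simply into a list; a real job would write to the warehouse
loaded = list(transform(csv.DictReader(io.StringIO(raw))))
# [{'id': 1, 'amount': 10.5}, {'id': 3, 'amount': 4.5}]
```

The validate-and-discard step stands in for the data-quality checks that production ETL pipelines apply between extract and load.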

Basic responsibilities of big data architects 4

Responsibilities:

1. Responsible for the architecture design and core code development of the big data platform; compile relevant technical documents according to project requirements;

2. Responsible for the architecture review, code review and online review of the big data platform; Participate in data application requirements, design, audit and review;

3. Responsible for core module research and development, big data platform construction, and complete system debugging, integration and implementation;

4. Responsible for establishing and maintaining the technical standards and specifications of the big data platform, and guiding developers in writing code;

Requirements:

1. Bachelor degree or above in a computer-related major;

2. Proficient in offline and real-time data processing pipelines; master offline processing frameworks such as Hive, Impala and Spark SQL, and common real-time processing tools such as Storm and Spark Streaming;

3. Familiar with the big data technology ecosystem, proficient in the big data technology architecture, and experienced in building a big data platform;

4. Master common data-ingestion tools, including Flume, Kafka, etc.;

5. Master the basics of the Linux operating system and scripting languages (such as Shell);

6. Master one or more languages used for real-time processing, such as Java, Scala and Python; Scala experience is preferred;

7. Experience in dealing with actual large-scale data (above TB level) is preferred;
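The real-time processing that requirements 2 and 6 above refer to typically means windowed aggregation over an event stream. The tumbling-window sketch below illustrates the idea behind Storm/Spark Streaming micro-batches in pure Python; the events, timestamps and window length are all invented for illustration.

```python
from collections import Counter

# Hypothetical event stream: (event_kind, timestamp_in_seconds) pairs
events = [("click", 0), ("view", 1), ("click", 4), ("click", 5), ("view", 9)]
WINDOW = 5  # tumbling window length in seconds

windows = {}
for kind, ts in events:
    bucket = ts // WINDOW  # assign each event to its 5-second window
    windows.setdefault(bucket, Counter())[kind] += 1

# windows[0] counts events with ts 0-4, windows[1] counts ts 5-9
```

A real streaming engine does the same bucketing continuously and in parallel, adding delivery guarantees and handling of late-arriving events.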

Basic responsibilities of big data architects 5

Responsibilities:

1. Responsible for the R&D and design of the company's big data processing framework, and for producing feasible solutions and technical specifications;

2. Develop and improve the company's big data platform; Participate in the design, development and testing of offline and real-time big data processing systems and the automatic integration of multiple business modules;

3. Responsible for the design and planning of statistical analysis module of business platform data;

4. Responsible for data and storage design during product development;

5. Lead and train the team to complete decomposed organizational goals;

Requirements:

1. Bachelor degree or above in computer science or software engineering, with at least 8 years of working experience and 5 years of big data development experience;

2. Familiar with Java, Hadoop, HDFS, Hive, HBase, Spark, Storm, Flume and related technology stacks;

3. Familiar with data warehouse theory, data algorithm and distributed computing technology, and have experience in designing the overall system architecture of big data;

4. Familiar with Linux systems and skilled in using Shell/Perl/Python scripts to solve problems;

5. Knowledge of TensorFlow and machine learning (SVM, random forest, deep learning, Bayesian methods, etc.) is preferred;

6. Able to organize the project development team to work together, including team communication, planning and development environment management.
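The Shell/Perl/Python scripting skill listed in requirement 4 above usually means ad-hoc triage of logs and data files. The sketch below shows the kind of small Python script that covers this; the log lines and their format are invented for illustration.

```python
import re

# Hypothetical application log (format invented for this sketch)
log = """2024-01-01 INFO started
2024-01-01 ERROR disk full
2024-01-02 WARN slow query
2024-01-02 ERROR disk full"""

# Count occurrences of each log level across all lines
levels = re.findall(r"\b(INFO|WARN|ERROR)\b", log)
counts = {lvl: levels.count(lvl) for lvl in set(levels)}
# {'INFO': 1, 'WARN': 1, 'ERROR': 2}
```

The equivalent one-liner in Shell would pipe the log through `grep` and `sort | uniq -c`; either way, the point is quick diagnosis without a full processing framework.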