Deploy a StarRocks Cluster on AWS by Using AWS CloudFormation
StarRocks has supported integration with AWS CloudFormation since version 2.3.0. You can use AWS CloudFormation to quickly deploy and use StarRocks clusters on AWS.
AWS CloudFormation
AWS CloudFormation is a service provided by AWS that helps you conveniently and quickly model and set up AWS resources and third-party resources, such as StarRocks clusters, so that you can spend less time managing resources and more time using them. You create a template that describes the resources you need, and AWS CloudFormation takes care of provisioning and configuring those resources for you. For more information, see What is AWS CloudFormation?
Basic Concepts
Templates
Templates are JSON or YAML formatted text files that describe AWS resources and third-party resources, as well as the properties of those resources. For more information, see Templates.
Stacks
Stacks are used to create and manage the resources described in templates. You can create, update, and delete a set of resources by creating, updating, and deleting stacks. All resources in a stack are defined by a template. Suppose you have created a template that describes various resources. To configure these resources, you need to create a stack by submitting the template that you created, and AWS CloudFormation then configures all those resources for you. For more information, see Stacks.
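The template-stack relationship also maps directly onto the AWS SDK: a stack is created, updated, or deleted by pointing CloudFormation at a template. The following minimal sketch uses Python and boto3; the stack name, template URL, and region are placeholders rather than values prescribed by this guide.

```python
# Minimal sketch of the stack lifecycle with boto3; all names and URLs are placeholders.
import boto3

cf = boto3.client("cloudformation", region_name="us-east-1")  # assumed region

# Create a stack: CloudFormation provisions every resource described in the template.
cf.create_stack(
    StackName="my-starrocks-stack",  # placeholder stack name
    TemplateURL="https://example-bucket.s3.amazonaws.com/templates/starrocks.template.yaml",  # placeholder URL
    # Capabilities=["CAPABILITY_IAM"],  # required only if the template creates IAM resources
)

# Update the stack by submitting a revised template; CloudFormation changes only the resources that differ.
# cf.update_stack(StackName="my-starrocks-stack", TemplateURL="https://example-bucket.s3.amazonaws.com/templates/revised.template.yaml")

# Delete the stack; all of the resources it created are removed together.
# cf.delete_stack(StackName="my-starrocks-stack")
```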
Procedure
Log in to the AWS CloudFormation console.
Choose Create stack > With new resources (standard).
Follow these steps to specify the template:
- Under Prerequisite - Prepare template, choose Template is ready.
- Under Specify template, set Template source to Amazon S3 URL. Then, under Amazon S3 URL, enter the following address:
https://cf-templates-1euv6e68138u2-us-east-1.s3.amazonaws.com/templates/starrocks.template.yaml
Note: You can also set Template source to Upload a template file and then click Choose file to upload the starrocks.template.yaml file, which you can download from the aws-cloudformation repository in the StarRocks project. You can also check the template from the AWS SDK before proceeding, as sketched after this list.
- Click Next.
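Before continuing in the console, you can optionally confirm that the template at the address above is reachable and see which parameters it declares. The sketch below uses Python and boto3; validate_template only parses the template and creates no resources, and the region is an assumption.

```python
# Optional check: parse the StarRocks template and list the parameters it declares.
import boto3

cf = boto3.client("cloudformation", region_name="us-east-1")  # assumed region

result = cf.validate_template(
    TemplateURL="https://cf-templates-1euv6e68138u2-us-east-1.s3.amazonaws.com/templates/starrocks.template.yaml"
)

print(result.get("Description", ""))
for param in result.get("Parameters", []):
    print(param["ParameterKey"], "- default:", param.get("DefaultValue"))
```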
Specify the stack details, including the Stack name and Parameters, and then click Next.
Type a stack name in the Stack name field.
The stack name is an identifier that helps you find a particular stack in a list of stacks. A stack name can contain only letters (case-sensitive), numbers, and hyphens (-). It must start with a letter and can be up to 128 characters in length.
Configure the following parameters.
| Type | Parameter | Description |
| --- | --- | --- |
| Network configuration | Availability Zones | Select an availability zone for deploying the StarRocks cluster. For more information, see Regions and Zones. |
| EC2 configuration | Key pair name | A key pair, consisting of a public key and a private key, is a set of security credentials that you use to prove your identity when you attempt to connect to an Amazon EC2 instance. For more information, see key pairs. Note: If you need to create a key pair, see Create key pairs. |
| Environment configuration | Reference the latest Amazon Linux AMI in a CloudFormation template | The ID of the latest Amazon Machine Image (AMI) with the 64-bit architecture (x86_64), which is used to launch an Amazon EC2 instance. Note: An AMI is a supported and maintained image provided by AWS. It provides the information required to launch an Amazon EC2 instance. For more information, see Amazon Machine Images. |
| Environment configuration | URL of download JDK 1.8 | The URL from which you can download JDK 1.8. |
| Environment configuration | URL of StarRocks | The URL from which you can download the StarRocks binary package. |
| StarRocks Cluster configuration | Number of StarRocks Fe | The number of FEs. Default value: 1. Valid values: 1 and 3. |
| StarRocks Cluster configuration | Fe instance type | The instance type of the Amazon EC2 instance on which an FE node runs. Default value: t2.micro. For more information about instance types, see Amazon EC2 Instance Types. |
| StarRocks Cluster configuration | Number of StarRocks Be | The number of BEs. Default value: 3. Valid values: 3 and 6. |
| StarRocks Cluster configuration | Be instance type | The instance type of the Amazon EC2 instances on which the BE nodes run. Default value: t2.micro. For more information about instance types, see Amazon EC2 Instance Types. |
| Fe configuration | Dir to save fe log | The path of the FE log file. Only absolute paths are allowed. |
| Fe configuration | Sys Log Level | The logging level of the FE. Default level: INFO. Valid levels: INFO, WARN, ERROR, and FATAL. |
| Fe configuration | Meta data dir | The path of the FE metadata file. The value of this parameter must be an absolute path. The default value is feDefaultMetaPath, which means that /home/starrocks/StarRocks/fe/meta is used. |
| BE configuration | Dir to save be sys log | The path of the log file on each BE. The value of this parameter must be an absolute path. |
| BE configuration | Sys Log Level | The log level of each BE. Default level: INFO. Valid levels: INFO, WARN, ERROR, and FATAL. |
| BE configuration | Volume type of Be nodes | The type of the Amazon EBS volume attached to the Amazon EC2 instances on which the BE nodes run. An Amazon EBS volume is a block-level storage device that you can attach to Amazon EC2 instances. For more information, see Amazon EBS volumes. |
| BE configuration | Volume size of Be nodes | The storage capacity of the EBS volumes that the BE nodes use to store data. Unit: GB. |
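If you would rather drive the same deployment from the AWS SDK instead of the console, the parameters above are supplied to the stack as ParameterKey/ParameterValue pairs. The sketch below uses Python and boto3. The parameter keys (for example NumberOfFe and FeInstanceType) are illustrative assumptions, not keys confirmed by this guide; the console labels in the table are display names, so read the template's actual parameter IDs first, for example with validate_template.

```python
# Sketch: create the stack from the SDK with a few of the parameters described above.
# The ParameterKey values below are assumptions; read the real keys from the template first.
import boto3

cf = boto3.client("cloudformation", region_name="us-east-1")  # assumed region

cf.create_stack(
    StackName="starrocks-demo",  # placeholder stack name
    TemplateURL="https://cf-templates-1euv6e68138u2-us-east-1.s3.amazonaws.com/templates/starrocks.template.yaml",
    Parameters=[
        {"ParameterKey": "KeyPairName", "ParameterValue": "my-key-pair"},   # assumed key
        {"ParameterKey": "NumberOfFe", "ParameterValue": "1"},              # assumed key
        {"ParameterKey": "NumberOfBe", "ParameterValue": "3"},              # assumed key
        {"ParameterKey": "FeInstanceType", "ParameterValue": "t2.micro"},   # assumed key
        {"ParameterKey": "BeInstanceType", "ParameterValue": "t2.micro"},   # assumed key
    ],
    # Required only if the template creates IAM resources; adjust to what the template actually needs.
    Capabilities=["CAPABILITY_IAM"],
)
```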
Configure additional options for the stack. For more information, see Setting AWS CloudFormation stack options.
After the configuration is completed, click Next.
Review the stack information you entered, including the template, details, and more options. You can also estimate the cost of your stack. For more information, see Reviewing your stack and estimating stack cost on the AWS CloudFormation console.
Note: If you need to change any of the parameters, click Edit in the top right corner of the related section to return to the relevant page.
Select the following two check boxes and click Create stack.
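Stack creation is finished when the stack reaches the CREATE_COMPLETE status, which the console shows in the stack list. If you created the stack through the SDK, you can wait for that status and read the stack outputs as sketched below; the stack name and region are placeholders.

```python
# Sketch: wait until stack creation finishes, then print its status and outputs.
import boto3

cf = boto3.client("cloudformation", region_name="us-east-1")  # assumed region
stack_name = "starrocks-demo"  # placeholder stack name

# Poll until the stack reaches CREATE_COMPLETE (raises if creation fails or rolls back).
cf.get_waiter("stack_create_complete").wait(
    StackName=stack_name,
    WaiterConfig={"Delay": 30, "MaxAttempts": 120},
)

stack = cf.describe_stacks(StackName=stack_name)["Stacks"][0]
print(stack["StackStatus"])
for output in stack.get("Outputs", []):
    print(output["OutputKey"], "=", output["OutputValue"])
```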