Hands-On HBase API and Shell Operations: Creating Tables, Inserting, Querying, and Updating Data
1. Use the API to create the Stu_Class table in HBase, then list all tables in the HBase shell.
Summary: Use the Java API to create a table named 'Stu_Class' with two column families, 'basic_info' and 'course_info'. Then run the HBase shell command 'list' to show all tables.
Code:
import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.hbase.HBaseConfiguration;
import org.apache.hadoop.hbase.HColumnDescriptor;
import org.apache.hadoop.hbase.HTableDescriptor;
import org.apache.hadoop.hbase.TableName;
import org.apache.hadoop.hbase.client.Admin;
import org.apache.hadoop.hbase.client.Connection;
import org.apache.hadoop.hbase.client.ConnectionFactory;
public class CreateTable {
    public static void main(String[] args) throws Exception {
        Configuration conf = HBaseConfiguration.create();
        Connection conn = ConnectionFactory.createConnection(conf);
        Admin admin = conn.getAdmin();
        TableName tableName = TableName.valueOf("Stu_Class");
        if (admin.tableExists(tableName)) {
            System.out.println("Table already exists!");
        } else {
            // Define the table with its two column families
            HTableDescriptor tableDescriptor = new HTableDescriptor(tableName);
            tableDescriptor.addFamily(new HColumnDescriptor("basic_info"));
            tableDescriptor.addFamily(new HColumnDescriptor("course_info"));
            admin.createTable(tableDescriptor);
            System.out.println("Table created successfully!");
        }
        admin.close();
        conn.close();
    }
}
Commands:
hbase shell
list
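To confirm that both column families were created as intended, the standard `describe` shell command can be used as well:

```
describe 'Stu_Class'
```

`describe` prints each column family with its settings (VERSIONS, TTL, block size, and so on), so you can verify that 'basic_info' and 'course_info' exist before inserting any data.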
Result screenshot:

2. Use the API to insert data into the table
Summary: Use the Java API to insert three rows into the 'Stu_Class' table; each row holds a student ID, name, gender, course ID, and score.
Code:
import java.util.ArrayList;
import java.util.List;
import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.hbase.HBaseConfiguration;
import org.apache.hadoop.hbase.TableName;
import org.apache.hadoop.hbase.client.Connection;
import org.apache.hadoop.hbase.client.ConnectionFactory;
import org.apache.hadoop.hbase.client.Put;
import org.apache.hadoop.hbase.client.Table;
import org.apache.hadoop.hbase.util.Bytes;
public class InsertData {
    public static void main(String[] args) throws Exception {
        Configuration conf = HBaseConfiguration.create();
        Connection conn = ConnectionFactory.createConnection(conf);
        Table table = conn.getTable(TableName.valueOf("Stu_Class"));
        List<Put> puts = new ArrayList<>();
        // Bytes.toBytes always encodes as UTF-8; String.getBytes() uses the
        // platform-default charset, which can corrupt the Chinese values.
        Put put1 = new Put(Bytes.toBytes("1001"));
        put1.addColumn(Bytes.toBytes("basic_info"), Bytes.toBytes("name"), Bytes.toBytes("张三"));
        put1.addColumn(Bytes.toBytes("basic_info"), Bytes.toBytes("gender"), Bytes.toBytes("男"));
        put1.addColumn(Bytes.toBytes("course_info"), Bytes.toBytes("course_id"), Bytes.toBytes("3-245"));
        put1.addColumn(Bytes.toBytes("course_info"), Bytes.toBytes("score"), Bytes.toBytes("90"));
        puts.add(put1);
        Put put2 = new Put(Bytes.toBytes("1002"));
        put2.addColumn(Bytes.toBytes("basic_info"), Bytes.toBytes("name"), Bytes.toBytes("李四"));
        put2.addColumn(Bytes.toBytes("basic_info"), Bytes.toBytes("gender"), Bytes.toBytes("女"));
        put2.addColumn(Bytes.toBytes("course_info"), Bytes.toBytes("course_id"), Bytes.toBytes("3-245"));
        put2.addColumn(Bytes.toBytes("course_info"), Bytes.toBytes("score"), Bytes.toBytes("85"));
        puts.add(put2);
        Put put3 = new Put(Bytes.toBytes("1003"));
        put3.addColumn(Bytes.toBytes("basic_info"), Bytes.toBytes("name"), Bytes.toBytes("王五"));
        put3.addColumn(Bytes.toBytes("basic_info"), Bytes.toBytes("gender"), Bytes.toBytes("男"));
        put3.addColumn(Bytes.toBytes("course_info"), Bytes.toBytes("course_id"), Bytes.toBytes("3-246"));
        put3.addColumn(Bytes.toBytes("course_info"), Bytes.toBytes("score"), Bytes.toBytes("95"));
        puts.add(put3);
        // One batched call sends all three rows in a single round trip
        table.put(puts);
        table.close();
        conn.close();
        System.out.println("Data inserted successfully!");
    }
}
3. Use Shell commands to query all data in the table
Commands:
scan 'Stu_Class'
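When the full-table output is noisy, `scan` also accepts options that restrict what comes back; for example (standard HBase shell syntax):

```
scan 'Stu_Class', {COLUMNS => ['basic_info:name', 'course_info:score']}
scan 'Stu_Class', {STARTROW => '1001', STOPROW => '1003'}
```

Note that STOPROW is exclusive, so the second scan returns rows 1001 and 1002 only.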
Result screenshot:

4. Use the Shell to change the course of row 1003 (stored in column course_info:course_id) to '网络技术' (Network Technology), then query row 1003 to verify the change
Commands:
put 'Stu_Class', '1003', 'course_info:course_id', '网络技术'
get 'Stu_Class', '1003'
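`get` can also be narrowed to the one column that was just updated, which makes the verification easier to read (standard HBase shell syntax):

```
get 'Stu_Class', '1003', 'course_info:course_id'
get 'Stu_Class', '1003', {COLUMN => 'course_info:course_id', VERSIONS => 3}
```

The second form asks for older versions as well; note, however, that a column family created with default settings as in step 1 typically keeps only one version, so you would need to raise the family's VERSIONS setting beforehand to still see the value from before the update.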
Result screenshot:

5. Use the API to query students enrolled in course 3-245
Summary: Use the Java API to query the information of all students whose enrolled course is '3-245'.
Code:
import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.hbase.Cell;
import org.apache.hadoop.hbase.CellScanner;
import org.apache.hadoop.hbase.CellUtil;
import org.apache.hadoop.hbase.HBaseConfiguration;
import org.apache.hadoop.hbase.TableName;
import org.apache.hadoop.hbase.client.Connection;
import org.apache.hadoop.hbase.client.ConnectionFactory;
import org.apache.hadoop.hbase.client.Result;
import org.apache.hadoop.hbase.client.ResultScanner;
import org.apache.hadoop.hbase.client.Scan;
import org.apache.hadoop.hbase.client.Table;
import org.apache.hadoop.hbase.filter.CompareFilter.CompareOp;
import org.apache.hadoop.hbase.filter.SingleColumnValueFilter;
import org.apache.hadoop.hbase.util.Bytes;
public class QueryData {
    public static void main(String[] args) throws Exception {
        Configuration conf = HBaseConfiguration.create();
        Connection conn = ConnectionFactory.createConnection(conf);
        Table table = conn.getTable(TableName.valueOf("Stu_Class"));
        Scan scan = new Scan();
        // Keep only rows whose course_info:course_id equals 3-245
        SingleColumnValueFilter filter = new SingleColumnValueFilter(
                Bytes.toBytes("course_info"), Bytes.toBytes("course_id"),
                CompareOp.EQUAL, Bytes.toBytes("3-245"));
        // Without this, rows that lack the course_id column would also pass the filter
        filter.setFilterIfMissing(true);
        scan.setFilter(filter);
        ResultScanner resultScanner = table.getScanner(scan);
        for (Result result : resultScanner) {
            CellScanner cellScanner = result.cellScanner();
            while (cellScanner.advance()) {
                Cell cell = cellScanner.current();
                System.out.print(Bytes.toString(CellUtil.cloneRow(cell)) + " ");
                System.out.print(Bytes.toString(CellUtil.cloneFamily(cell)) + ":");
                System.out.print(Bytes.toString(CellUtil.cloneQualifier(cell)) + " ");
                System.out.println(Bytes.toString(CellUtil.cloneValue(cell)));
            }
            System.out.println("--------------------");
        }
        resultScanner.close();
        table.close();
        conn.close();
    }
}
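The same query can be reproduced directly in the HBase shell with a filter string, which is a handy sanity check against the Java program's output (standard HBase shell filter syntax):

```
scan 'Stu_Class', {FILTER => "SingleColumnValueFilter('course_info', 'course_id', =, 'binary:3-245')"}
```

Given the data inserted in step 2, this should match students 1001 and 1002, since row 1003 was inserted with course 3-246.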
Result screenshot:
