Tables incompatible when using Athena with Amazon Glue in Amazon QuickSight

If you get errors when you use Amazon Glue tables with Athena in Amazon QuickSight, you might be missing some metadata. Follow these steps to see whether your tables are missing the TableType attribute that the Athena connector for Amazon QuickSight needs in order to work. Typically, the metadata for these tables wasn't migrated to the Amazon Glue Data Catalog. For more information, see Upgrading to the Amazon Glue Data Catalog step-by-step in the Amazon Glue Developer Guide.

If you don't want to migrate to the Amazon Glue Data Catalog at this time, you have two options. You can re-create each Amazon Glue table through the Amazon Glue management console. Or you can use the Amazon CLI script outlined in the following procedure to identify and update the tables that are missing the TableType attribute.

If you prefer to do this with the CLI, the following procedure can help you design your script; an example script that combines the steps appears after the procedure.
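
As a quick first check, you can filter the get-tables output with the CLI's built-in --query option so that only tables with no TableType attribute are listed. The following one-liner is a sketch; the database name is illustrative, and any table name it prints belongs to a table that is missing the attribute.

    aws glue get-tables --database-name test_database \
        --query 'TableList[?TableType == `null`].Name' --output text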

To use the CLI to design a script
  1. Use the CLI to find out which Amazon Glue tables don't have the TableType attribute.

    aws glue get-tables --database-name <your_database_name>

    For example, you can run the following command in the CLI.

    aws glue get-tables --database-name "test_database"

    Following is an example of the output. Notice that the table "table_missing_table_type" doesn't have a TableType attribute declared.

    { "TableList": [ { "Retention": 0, "UpdateTime": 1522368588.0, "PartitionKeys": [ { "Name": "year", "Type": "string" }, { "Name": "month", "Type": "string" }, { "Name": "day", "Type": "string" } ], "LastAccessTime": 1513804142.0, "Owner": "owner", "Name": "table_missing_table_type", "Parameters": { "delimiter": ",", "compressionType": "none", "skip.header.line.count": "1", "sizeKey": "75", "averageRecordSize": "7", "classification": "csv", "objectCount": "1", "typeOfData": "file", "CrawlerSchemaDeserializerVersion": "1.0", "CrawlerSchemaSerializerVersion": "1.0", "UPDATED_BY_CRAWLER": "crawl_date_table", "recordCount": "9", "columnsOrdered": "true" }, "StorageDescriptor": { "OutputFormat": "org.apache.hadoop.hive.ql.io.HiveIgnoreKeyTextOutputFormat", "SortColumns": [], "StoredAsSubDirectories": false, "Columns": [ { "Name": "col1", "Type": "string" }, { "Name": "col2", "Type": "bigint" } ], "Location": "s3://myAthenatest/test_dataset/", "NumberOfBuckets": -1, "Parameters": { "delimiter": ",", "compressionType": "none", "skip.header.line.count": "1", "columnsOrdered": "true", "sizeKey": "75", "averageRecordSize": "7", "classification": "csv", "objectCount": "1", "typeOfData": "file", "CrawlerSchemaDeserializerVersion": "1.0", "CrawlerSchemaSerializerVersion": "1.0", "UPDATED_BY_CRAWLER": "crawl_date_table", "recordCount": "9" }, "Compressed": false, "BucketColumns": [], "InputFormat": "org.apache.hadoop.mapred.TextInputFormat", "SerdeInfo": { "Parameters": { "field.delim": "," }, "SerializationLibrary": "org.apache.hadoop.hive.serde2.lazy.LazySimpleSerDe" } } } ] }
  2. In an editor, edit the table definition to add "TableType": "EXTERNAL_TABLE" to it, as shown in the following example.

    { "Table": { "Retention": 0, "TableType": "EXTERNAL_TABLE", "PartitionKeys": [ { "Name": "year", "Type": "string" }, { "Name": "month", "Type": "string" }, { "Name": "day", "Type": "string" } ], "UpdateTime": 1522368588.0, "Name": "table_missing_table_type", "StorageDescriptor": { "BucketColumns": [], "SortColumns": [], "StoredAsSubDirectories": false, "OutputFormat": "org.apache.hadoop.hive.ql.io.HiveIgnoreKeyTextOutputFormat", "SerdeInfo": { "SerializationLibrary": "org.apache.hadoop.hive.serde2.lazy.LazySimpleSerDe", "Parameters": { "field.delim": "," } }, "Parameters": { "classification": "csv", "CrawlerSchemaSerializerVersion": "1.0", "UPDATED_BY_CRAWLER": "crawl_date_table", "columnsOrdered": "true", "averageRecordSize": "7", "objectCount": "1", "sizeKey": "75", "delimiter": ",", "compressionType": "none", "recordCount": "9", "CrawlerSchemaDeserializerVersion": "1.0", "typeOfData": "file", "skip.header.line.count": "1" }, "Columns": [ { "Name": "col1", "Type": "string" }, { "Name": "col2", "Type": "bigint" } ], "Compressed": false, "InputFormat": "org.apache.hadoop.mapred.TextInputFormat", "NumberOfBuckets": -1, "Location": "s3://myAthenatest/test_date_part/" }, "Owner": "owner", "Parameters": { "classification": "csv", "CrawlerSchemaSerializerVersion": "1.0", "UPDATED_BY_CRAWLER": "crawl_date_table", "columnsOrdered": "true", "averageRecordSize": "7", "objectCount": "1", "sizeKey": "75", "delimiter": ",", "compressionType": "none", "recordCount": "9", "CrawlerSchemaDeserializerVersion": "1.0", "typeOfData": "file", "skip.header.line.count": "1" }, "LastAccessTime": 1513804142.0 } }
  3. Adapt the following command to update the table input so that it includes the TableType attribute.

    aws glue update-table --database-name <your_database_name> --table-input <updated_table_input>

    Following is an example.

    aws glue update-table --database-name test_database --table-input ' { "Retention": 0, "TableType": "EXTERNAL_TABLE", "PartitionKeys": [ { "Name": "year", "Type": "string" }, { "Name": "month", "Type": "string" }, { "Name": "day", "Type": "string" } ], "Name": "table_missing_table_type", "StorageDescriptor": { "BucketColumns": [], "SortColumns": [], "StoredAsSubDirectories": false, "OutputFormat": "org.apache.hadoop.hive.ql.io.HiveIgnoreKeyTextOutputFormat", "SerdeInfo": { "SerializationLibrary": "org.apache.hadoop.hive.serde2.lazy.LazySimpleSerDe", "Parameters": { "field.delim": "," } }, "Parameters": { "classification": "csv", "CrawlerSchemaSerializerVersion": "1.0", "UPDATED_BY_CRAWLER": "crawl_date_table", "columnsOrdered": "true", "averageRecordSize": "7", "objectCount": "1", "sizeKey": "75", "delimiter": ",", "compressionType": "none", "recordCount": "9", "CrawlerSchemaDeserializerVersion": "1.0", "typeOfData": "file", "skip.header.line.count": "1" }, "Columns": [ { "Name": "col1", "Type": "string" }, { "Name": "col2", "Type": "bigint" } ], "Compressed": false, "InputFormat": "org.apache.hadoop.mapred.TextInputFormat", "NumberOfBuckets": -1, "Location": "s3://myAthenatest/test_date_part/" }, "Owner": "owner", "Parameters": { "classification": "csv", "CrawlerSchemaSerializerVersion": "1.0", "UPDATED_BY_CRAWLER": "crawl_date_table", "columnsOrdered": "true", "averageRecordSize": "7", "objectCount": "1", "sizeKey": "75", "delimiter": ",", "compressionType": "none", "recordCount": "9", "CrawlerSchemaDeserializerVersion": "1.0", "typeOfData": "file", "skip.header.line.count": "1" }, "LastAccessTime": 1513804142.0 }'
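
If many tables are affected, you can combine the preceding steps into a small shell script. The following is a minimal sketch rather than a tested utility: it assumes that the CLI and jq are available, the database name is illustrative, and the list of read-only fields removed from the get-table output might need adjusting for your CLI version, because update-table rejects fields that aren't part of the TableInput structure.

    #!/bin/bash
    # Sketch: add TableType to every table in a database that is missing it.
    DB=test_database

    for TABLE in $(aws glue get-tables --database-name "$DB" \
        --query 'TableList[?TableType == `null`].Name' --output text); do

      # Fetch the current definition, drop read-only fields that TableInput
      # doesn't accept, and add the missing TableType attribute.
      # Extend the del() list if update-table reports other unknown fields.
      aws glue get-table --database-name "$DB" --name "$TABLE" \
        | jq '.Table
              | del(.DatabaseName, .CreateTime, .UpdateTime, .CreatedBy,
                    .IsRegisteredWithLakeFormation, .CatalogId, .VersionId)
              | . + {TableType: "EXTERNAL_TABLE"}' > /tmp/table-input.json

      # Write the edited definition back to the Amazon Glue Data Catalog.
      aws glue update-table --database-name "$DB" \
        --table-input file:///tmp/table-input.json
      echo "Updated $TABLE"
    done

To confirm the change for a single table, you can check that the attribute is now present, for example with aws glue get-table --database-name test_database --name table_missing_table_type --query 'Table.TableType'.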