bootleg.layers package

Submodules

bootleg.layers.alias_to_ent_encoder module

AliasEntityTable class.

class bootleg.layers.alias_to_ent_encoder.AliasEntityTable(data_config, entity_symbols)[source]

Bases: torch.nn.modules.module.Module

Stores a table of the K candidate entity IDs for each alias.

Parameters
  • data_config – data config

  • entity_symbols – entity symbols

classmethod build_alias_table(data_config, entity_symbols)[source]

Construct the alias-to-EID table.

Parameters
  • data_config – data config

  • entity_symbols – entity symbols

Returns: numpy array where each row is an alias ID and the columns are its candidate EIDs
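
A hedged sketch of the construction this produces, using a plain dict of candidate lists in place of entity_symbols; the variable names and the padding EID of 0 are assumptions for illustration, not the real Bootleg API:

    import numpy as np

    alias_to_cands = {"lincoln": [12, 7], "paris": [5, 9, 21]}  # alias -> candidate EIDs
    alias2id = {"lincoln": 0, "paris": 1}                       # alias -> row index
    K = 3                                                       # max candidates per alias

    table = np.zeros((len(alias2id), K), dtype=np.int64)        # assumed pad EID of 0
    for alias, eids in alias_to_cands.items():
        n = min(len(eids), K)
        table[alias2id[alias], :n] = eids[:n]
    # Each row is an alias ID, each column one of its candidate EIDs.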

forward(alias_indices)[source]

Model forward.

Parameters

alias_indices – alias indices (B x M)

Returns: entity candidate EIDs (B x M x K)
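
A toy illustration of the lookup and its shapes; the table values are made up and stand in for the table built during prep:

    import torch

    toy_table = torch.tensor([[12, 7, 0],
                              [5, 9, 21]], dtype=torch.long)  # num_aliases x K
    alias_indices = torch.tensor([[1, 0]])                    # B x M = 1 x 2
    candidate_eids = toy_table[alias_indices]                 # B x M x K = 1 x 2 x 3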

get_alias_eid_priors(alias_indices)[source]

Return the prior scores of the candidates for the given alias indices.

Parameters

alias_indices – alias indices (B x M)

Returns: normalized entity candidate prior scores (B x M x K x 1)
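
A hedged shape illustration with toy numbers; how the real priors are computed is not shown here, only that a per-(alias, candidate) score table indexed the same way as the EID table yields the documented output shape:

    import torch

    prior_table = torch.tensor([[0.7, 0.3, 0.0],
                                [0.5, 0.4, 0.1]])         # num_aliases x K, rows normalized
    alias_indices = torch.tensor([[1, 0]])                # B x M
    priors = prior_table[alias_indices].unsqueeze(-1)     # B x M x K x 1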

classmethod prep(data_config, entity_symbols, num_aliases_with_pad_and_unk, num_cands_K)[source]

Prepares the alias-to-EID table.

Parameters
  • data_config – data config

  • entity_symbols – entity symbols

  • num_aliases_with_pad_and_unk – number of aliases including pad and unk

  • num_cands_K – number of candidates per alias (aka K)

Returns: torch Tensor of the alias-to-EID table; also saves it as a .pt file
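
A minimal sketch of the return-and-cache step, assuming the table comes from build_alias_table as a numpy array; the filename below is a hypothetical stand-in for whatever path data_config specifies:

    import numpy as np
    import torch

    table_np = np.array([[12, 7, 0],
                         [5, 9, 21]], dtype=np.int64)   # stand-in for build_alias_table(...)
    table = torch.from_numpy(table_np)                   # the torch Tensor returned to the caller
    torch.save(table, "alias2entity_table.pt")           # cached to disk as a .pt file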

training: bool

bootleg.layers.bert_encoder module

BERT encoder.

class bootleg.layers.bert_encoder.Encoder(transformer, out_dim)[source]

Bases: torch.nn.modules.module.Module

Encoder module.

Returns the CLS token of the Transformer, projected to out_dim.

Parameters
  • transformer – transformer

  • out_dim – out dimension to project to

forward(token_ids, segment_ids=None, attention_mask=None)[source]

BERT Encoder forward.
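
A minimal sketch of the described behavior, assuming the wrapped transformer follows the Hugging Face convention of exposing config.hidden_size and returning an output with last_hidden_state; the class and layer names are illustrative, not the exact Bootleg implementation:

    import torch.nn as nn

    class ToyCLSEncoder(nn.Module):
        def __init__(self, transformer, out_dim):
            super().__init__()
            self.transformer = transformer
            # assumes a Hugging Face-style config on the wrapped model
            self.proj = nn.Linear(transformer.config.hidden_size, out_dim)

        def forward(self, token_ids, segment_ids=None, attention_mask=None):
            out = self.transformer(
                input_ids=token_ids,
                token_type_ids=segment_ids,
                attention_mask=attention_mask,
            )
            cls = out.last_hidden_state[:, 0]  # CLS token: first position of each sequence
            return self.proj(cls)              # project to out_dim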

training: bool

Module contents

Layer init.