Conversation

@yandai yandai commented Jul 13, 2023

I think tiled_prompt_lengths_buf_ should be initialized to zero.

When invokeMaskPaddingTokens reads an uninitialized tiled_prompt_lengths_buf_, the result will be incorrect.
https://github.com/Nvidia/FasterTransformer/blob/main/src/fastertransformer/models/gptneox/GptNeoX.cc#L760

@yandai yandai changed the title from "fix: initialize tiled_prompt_lengths_buf_ to zero in gptneo" to "fix: initialize tiled_prompt_lengths_buf_ to zero in gptneox" Jul 13, 2023
@RobotGF RobotGF mentioned this pull request Sep 8, 2023