
[Bug][Jira] Data too long for column 'component' error while collecting/processing data #8208

Open · 3 tasks done
aviwarner opened this issue Nov 21, 2024 · 6 comments
Labels
  • component/plugins: This issue or PR relates to plugins
  • severity/p1: This bug affects functionality or significantly affect ux
  • type/bug: This issue is a bug

Comments

@aviwarner

Search before asking

  • I had searched in the issues and found no similar issues.

What happened

Data too long for column 'component' error while collecting/processing data. I have seen some similar issues, but in this case, the data type doesn't seem to be a length-limited one.

I checked the DB and found a components field in the _tool_jira_issues table whose data type is text, so it doesn't appear to be a character-limited field. I'm not sure how I might track down the offending Jira issue for further troubleshooting.
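
For reference, a rough query along these lines (a sketch assuming direct MySQL access to the DevLake database; column names are the ones I see in my instance) should surface the issues with the longest components data:

    -- Largest components payloads in the raw Jira issue table
    SELECT issue_key, CHAR_LENGTH(components) AS components_len
    FROM _tool_jira_issues
    ORDER BY components_len DESC
    LIMIT 20;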

Interestingly, my first collection ran fine. Only subsequent ones have had this error.

I attempted to upgrade to the latest version, and also deleted and recreated the project and connection, with no change.

attached stack trace
-- stack trace:
| github.com/apache/incubator-devlake/server/services.runPipeline
| /app/server/services/pipeline_runner.go:79
| [...repeated from below...]
Wraps: (2) Error running pipeline 18.
Wraps: (3) attached stack trace
-- stack trace:
| github.com/apache/incubator-devlake/server/services.RunTasksStandalone
| /app/server/services/task.go:217
| github.com/apache/incubator-devlake/server/services.(*pipelineRunner).runPipelineStandalone.func1
| /app/server/services/pipeline_runner.go:42
| github.com/apache/incubator-devlake/core/runner.runPipelineTasks
| /app/core/runner/run_pipeline.go:90
| github.com/apache/incubator-devlake/core/runner.RunPipeline
| /app/core/runner/run_pipeline.go:54
| github.com/apache/incubator-devlake/server/services.(*pipelineRunner).runPipelineStandalone
| /app/server/services/pipeline_runner.go:38
| github.com/apache/incubator-devlake/server/services.runPipeline
| /app/server/services/pipeline_runner.go:76
| github.com/apache/incubator-devlake/server/services.RunPipelineInQueue.func1
| /app/server/services/pipeline.go:349
| runtime.goexit
| /usr/local/go/src/runtime/asm_arm64.s:1172
Wraps: (4) attached stack trace
| -- stack trace:
| | github.com/apache/incubator-devlake/server/services.RunTasksStandalone.func1
| | /app/server/services/task.go:189
| Wraps: (2) Error running task 223.
| Wraps: (3) attached stack trace
| -- stack trace:
| | github.com/apache/incubator-devlake/core/runner.RunPluginSubTasks
| | /app/core/runner/run_task.go:333
| | github.com/apache/incubator-devlake/core/runner.RunPluginTask
| | /app/core/runner/run_task.go:165
| | github.com/apache/incubator-devlake/core/runner.RunTask
| | /app/core/runner/run_task.go:139
| | github.com/apache/incubator-devlake/server/services.runTaskStandalone
| | /app/server/services/task_runner.go:114
| | github.com/apache/incubator-devlake/server/services.RunTasksStandalone.func1
| | /app/server/services/task.go:187
| Wraps: (4) subtask convertIssues ended unexpectedly
| Wraps: (5) attached stack trace
| -- stack trace:
| | github.com/apache/incubator-devlake/helpers/pluginhelper/api.(*StatefulDataConverter[...]).Execute
| | /app/helpers/pluginhelper/api/data_convertor_stateful.go:184
| | [...repeated from below...]
| Wraps: (6) error adding result to batch (500)
| Wraps: (7) attached stack trace
| -- stack trace:
| | github.com/apache/incubator-devlake/impls/dalgorm.(*Dalgorm).convertGormError
| | /app/impls/dalgorm/dalgorm.go:547
| | github.com/apache/incubator-devlake/impls/dalgorm.(*Dalgorm).CreateOrUpdate
| | /app/impls/dalgorm/dalgorm.go:265
| | github.com/apache/incubator-devlake/helpers/pluginhelper/api.(*BatchSave).flushWithoutLocking
| | /app/helpers/pluginhelper/api/batch_save.go:131
| | github.com/apache/incubator-devlake/helpers/pluginhelper/api.(*BatchSave).Add
| | /app/helpers/pluginhelper/api/batch_save.go:109
| | github.com/apache/incubator-devlake/helpers/pluginhelper/api.(*StatefulDataConverter[...]).Execute
| | /app/helpers/pluginhelper/api/data_convertor_stateful.go:182
| | github.com/apache/incubator-devlake/plugins/jira/tasks.ConvertIssues
| | /app/plugins/jira/tasks/issue_convertor.go:164
| | github.com/apache/incubator-devlake/core/runner.runSubtask
| | /app/core/runner/run_task.go:425
| | github.com/apache/incubator-devlake/core/runner.RunPluginSubTasks
| | /app/core/runner/run_task.go:330
| | github.com/apache/incubator-devlake/core/runner.RunPluginTask
| | /app/core/runner/run_task.go:165
| | github.com/apache/incubator-devlake/core/runner.RunTask
| | /app/core/runner/run_task.go:139
| | github.com/apache/incubator-devlake/server/services.runTaskStandalone
| | /app/server/services/task_runner.go:114
| | github.com/apache/incubator-devlake/server/services.RunTasksStandalone.func1
| | /app/server/services/task.go:187
| | runtime.goexit
| | /usr/local/go/src/runtime/asm_arm64.s:1172
| Wraps: (8) Error 1406 (22001): Data too long for column 'component' at row 174 (500)
| Wraps: (9) Error 1406 (22001): Data too long for column 'component' at row 174
| Error types: (1) *withstack.withStack (2) *errutil.withPrefix (3) *withstack.withStack (4) *errutil.withPrefix (5) *withstack.withStack (6) *errutil.withPrefix (7) *withstack.withStack (8) *errutil.withPrefix (9) *mysql.MySQLError
Error types: (1) *withstack.withStack (2) *errutil.withPrefix (3) *withstack.withStack (4) *errutil.leafError

What do you expect to happen

For the collection to run without error.

How to reproduce

Create a Jira connection in which, I assume, some issues have very long component names.

Anything else

No response

Version

v1.0.2-beta5

Are you willing to submit PR?

  • Yes I am willing to submit a PR!

Code of Conduct

  • I agree to follow this project's Code of Conduct

aviwarner added the type/bug label on Nov 21, 2024

dosubot bot commented Nov 21, 2024

Hey @aviwarner! I'm here to help you with any bugs, questions, or contributions you have for the project. Let's tackle this issue together!

The "Data too long for column 'component'" error in the _tool_jira_issues table is likely due to the Components field being defined with a varchar(255) type, which limits the data length to 255 characters. To resolve this error, you can modify the Components field to use a text type instead, which allows for longer strings.

Here's how you can modify the migration script to change the Components field to text:

type jiraIssue20240103 struct {
    Components string `gorm:"type:text"`
}

After making this change, you should run the migration again to update the database schema. This will allow the Components field to store longer strings, resolving the "Data too long for column 'component'" error [1].
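
If it is unclear whether the column was actually widened, the current definition can be checked directly before rerunning the migration (a sketch assuming a MySQL backend and the default table name):

    -- Show the current type of the components column
    SHOW COLUMNS FROM _tool_jira_issues LIKE 'components';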

To continue talking to Dosu, mention @dosu.

dosubot added the component/plugins and severity/p1 labels on Nov 21, 2024
@aviwarner (Author)

@dosu thanks bot :D it does look like it's already the text type


dosubot bot commented Nov 21, 2024

If the Components field is already of type text, the issue might be elsewhere. It's possible that the error is occurring due to a different field or a specific data processing step. Here are a few steps you can take to troubleshoot further:

  1. Check Other Fields: Verify if there are any other fields in the _tool_jira_issues table that might have a length restriction and could be causing the error; see the sketch after this list.

  2. Review Data: Look at the specific Jira issues being processed when the error occurs. There might be a particular issue with unusually long data in one of its fields.

  3. Logs and Stack Trace: Examine the logs and stack trace closely to identify if there's a specific part of the code where the error is triggered. This might give more context about which data or operation is causing the problem.

  4. Database Constraints: Double-check the database schema to ensure there are no constraints or triggers that might be affecting data insertion.

  5. Test with Sample Data: Try running the pipeline with a smaller dataset or specific issues to isolate the problem.
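
As a sketch for step 1 (assuming a MySQL backend), information_schema can list every length-limited column in the table so nothing is overlooked:

    -- All character-limited columns in _tool_jira_issues, shortest limit first
    SELECT COLUMN_NAME, COLUMN_TYPE, CHARACTER_MAXIMUM_LENGTH
    FROM information_schema.COLUMNS
    WHERE TABLE_SCHEMA = DATABASE()
      AND TABLE_NAME = '_tool_jira_issues'
      AND CHARACTER_MAXIMUM_LENGTH IS NOT NULL
    ORDER BY CHARACTER_MAXIMUM_LENGTH;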

If none of these steps resolve the issue, consider providing more details or specific examples of the data causing the error for further assistance.

To continue talking to Dosu, mention @dosu.

@aviwarner (Author)

I've deleted everything and started the project fresh. Since I just got everything set up, this isn't too much of an issue, and it is working for now. If I hit the problem again I'll resurface it. Once things are more established, this process might be more onerous, but for now I think I'm OK.

aviwarner reopened this on Nov 21, 2024
@aviwarner (Author)

Re-opening. The issue has re-emerged when trying to collect six months of data (rather than, I think, one month). Is there a good way to identify the specific Jira issues that might cause the error?
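
One hedged observation that may help: the stack trace reports the failure inside the convertIssues subtask, and the column it names is component (singular), not the components field of _tool_jira_issues, so the limit may sit on a converted/domain-layer table rather than the raw Jira table. Assuming a MySQL backend, this locates every column literally named component along with its type:

    -- Where does a column named 'component' exist, and how is it defined?
    SELECT TABLE_NAME, COLUMN_NAME, COLUMN_TYPE
    FROM information_schema.COLUMNS
    WHERE TABLE_SCHEMA = DATABASE()
      AND COLUMN_NAME = 'component';

Combined with the length query in the issue description above, that should point at both the offending column and the specific Jira issues feeding it.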

@klesh (Contributor)

klesh commented Nov 22, 2024

What was the name of the failing subtask?
