Novel Dataset Enhances Cotton Content Detection in Textiles

In a paper published in the journal Data in Brief, researchers highlighted the transformative impact of computer vision techniques on textile manufacturing. These methods have transformed the identification of imperfections in fibers and the measurement of cotton composition in fabrics, streamlining previously demanding and lengthy processes.

Sample images of the CottonFabricImageBD dataset. Image Credit: https://www.sciencedirect.com/science/article/pii/S2352340924006796

The study introduced a novel dataset of 1300 original fabric images, each with a specified cotton percentage ranging from 30% to 99% across 13 categories. The dataset expanded to 27,300 images with augmentation techniques, enhancing machine learning (ML) for accurate cotton content assessment.

Related Work

Traditionally, evaluating cotton content was labor-intensive. Existing datasets often focus on woven fabrics or mixed fabric types and are limited in scope, and the lack of datasets dedicated to cotton percentage detection has hindered progress in automation. Detailed knowledge of cotton fiber content is crucial for effective garment recycling and material selection, and the shift in consumer demand toward sustainability and transparency in textiles underscores the importance of accurate cotton percentage identification.

Textile Fabric Dataset Preparation

A team of experts, guided by textile manufacturing specialists, curated the data. Before collection began, preliminary research used a thread-counting device to identify cotton weave designs within textiles. Locating suitable textile retailers and manufacturers and securing their approval to participate was a critical phase of the project. With these preparatory steps complete, the team systematically collected and processed the data across several key tasks.

The data collection process involved capturing images of fabric samples at the selected sites to compile the initial dataset, using the standard camera configuration of a Samsung M12 smartphone. The team selected fabric samples with varying cotton content to ensure a comprehensive dataset.

Analysts examined garment tags or labels to ascertain the stated cotton percentage of each fabric sample and then measured the thread count. After collecting the fabric specimens and their labels, they used a thread-counting device to determine the cotton rate accurately: the device was placed flat on the fabric with the threads aligned, a magnifying lens provided enhanced clarity, and the device's counting grid was used to tally the threads.

Threads were tallied against the measurement grid and recorded manually. First, a reference sample of 100% cotton fabric was counted, yielding horizontal and vertical thread counts of 117 and 73, respectively. Team members then recorded thread counts for the remaining fabric samples, and the cotton percentage of each sample was computed from these counts.

The cotton content of each fabric was calculated carefully to sort the samples into categories, using the traditional thread count calculation widely employed in the fabric industry. The formula multiplies the horizontal and vertical thread counts of the sample and of an ideal (100% cotton) fabric, divides the sample's product by the ideal fabric's product, and multiplies by 100 to obtain the cotton percentage.

For instance, a fabric with 78 threads horizontally and 64 threads vertically has a total count of 4,992. The ideal fabric, with a 117 horizontal thread count (HTC) and 73 vertical thread count (VTC), has a combined thread count of 8,541. Dividing the sample thread count by the ideal thread count and multiplying by 100 yields a cotton percentage of 58.44%, rounded to 58%.
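The thread-count formula above can be sketched as a short function. This is an illustrative sketch, not code from the paper; the function name and the default ideal counts (117 × 73, from the 100% cotton reference sample) are assumptions based on the worked example.

```python
def cotton_percentage(sample_htc, sample_vtc, ideal_htc=117, ideal_vtc=73):
    """Estimate cotton content from thread counts.

    Multiplies the sample's horizontal and vertical thread counts,
    divides by the ideal (100% cotton) fabric's product, and scales
    to a percentage, rounded to the nearest whole number.
    """
    sample_total = sample_htc * sample_vtc   # e.g. 78 * 64 = 4992
    ideal_total = ideal_htc * ideal_vtc      # 117 * 73 = 8541
    return round(sample_total / ideal_total * 100)

print(cotton_percentage(78, 64))  # 4992 / 8541 * 100 ≈ 58.44 → 58
```

Running the worked example from the text (78 × 64 threads) reproduces the reported 58% figure.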

Data preprocessing included labeling the collected images and organizing them into folders for supervised learning. Each image was resized to 256 × 256 pixels and rescaled to a uniform pixel range, standardizing the data and readying it for augmentation. Image augmentation methods such as rotation, horizontal and vertical flipping, width and height shifting, shearing, and zooming were used to enrich the dataset, expanding it to 27,300 images. These techniques replicate shifts in object position within images, improving the model's ability to identify objects across different positions, scales, and distances and bolstering its capacity for generalization.
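The rescaling and flip-based augmentation steps can be illustrated with a minimal NumPy sketch. This is a simplified stand-in for the pipeline described above (it shows only rescaling and the two flips, not rotation, shifting, shearing, or zooming), and the function names and random test image are hypothetical.

```python
import numpy as np

def rescale(img):
    # Standardize pixel values from [0, 255] to [0, 1]
    return img.astype(np.float32) / 255.0

def augment(img):
    # Yield simple positional variants: original, horizontal flip, vertical flip
    yield img
    yield np.fliplr(img)  # mirror left-right
    yield np.flipud(img)  # mirror top-bottom

# Stand-in for one resized 256 x 256 RGB fabric image
rng = np.random.default_rng(0)
fabric = rng.integers(0, 256, size=(256, 256, 3), dtype=np.uint8)

variants = [rescale(v) for v in augment(fabric)]
print(len(variants))  # 3 variants per source image
```

In practice, a full augmentation pipeline (e.g. with rotation and shear ranges) would generate many more variants per image, which is how 1,300 originals can expand to 27,300.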

These augmentation techniques introduce variability in the dataset, making it more robust against overfitting and enhancing the model's ability to generalize to unseen data. By simulating diverse image conditions, the model becomes adept at handling real-world variations in input.

Established studies have demonstrated the effectiveness of traditional augmentation methods in computer vision tasks. Techniques like rotation, flips, shifts, and zooming enhanced dataset diversity, ensuring robust training and validation of AI models for precise cotton composition detection in textiles, advancing automation in the textile industry.

Conclusion

To sum up, computer vision techniques have revolutionized various sectors, encompassing textile production, by automating processes such as fiber defect detection and cotton content quantification. Traditionally labor-intensive, assessing cotton percentages now benefits from ML models trained on comprehensive datasets like the one introduced here.

This dataset of 1300 original images, expanded to 27,300 through augmentation, covers cotton percentages from 30% to 99%, facilitating robust model training and validation. By extracting key features from fabric images, these advancements promise enhanced accuracy and efficiency in determining cotton content using computer vision.


Written by

Silpaja Chandrasekar

Dr. Silpaja Chandrasekar has a Ph.D. in Computer Science from Anna University, Chennai. Her research expertise lies in analyzing traffic parameters under challenging environmental conditions. Additionally, she has gained valuable exposure to diverse research areas, such as detection, tracking, classification, medical image analysis, cancer cell detection, chemistry, and Hamiltonian walks.

