匍匐 posted on 2025-3-30 08:20:21

Development of a Deep Learning Model for the Classification of Mosquito Larvae Images
…with a cellphone camera by comparing various Deep Learning models (MobileNetV2, ResNet18, ResNet34, EfficientNet_B0 and EfficientNet_Lite0). The best results were obtained with EfficientNet_Lite0, with an accuracy of 97.5% during validation and 90% during testing, an acceptable result considering the ri…
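As a rough illustration of the kind of backbone comparison this post describes (not the authors' code), a transfer-learning setup in PyTorch/torchvision could look like the sketch below. The function name, the class count, and the head replacement are assumptions for the example; EfficientNet_Lite0 is not shipped with torchvision and would typically come from a separate library such as timm.

```python
import torch.nn as nn
from torchvision import models

NUM_CLASSES = 2  # assumption: number of larva classes in this illustrative setup

def build_classifier(name: str) -> nn.Module:
    """Return an ImageNet-pretrained backbone with a fresh classification head."""
    if name == "mobilenet_v2":
        model = models.mobilenet_v2(weights=models.MobileNet_V2_Weights.DEFAULT)
        model.classifier[-1] = nn.Linear(model.classifier[-1].in_features, NUM_CLASSES)
    elif name == "resnet18":
        model = models.resnet18(weights=models.ResNet18_Weights.DEFAULT)
        model.fc = nn.Linear(model.fc.in_features, NUM_CLASSES)
    elif name == "efficientnet_b0":
        model = models.efficientnet_b0(weights=models.EfficientNet_B0_Weights.DEFAULT)
        model.classifier[-1] = nn.Linear(model.classifier[-1].in_features, NUM_CLASSES)
    else:
        raise ValueError(f"unknown backbone: {name}")
    return model

# Each candidate would then be fine-tuned on the same train/validation split
# and compared on held-out test accuracy, as the abstract above describes.
```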

猛击 posted on 2025-3-30 18:39:32

Crop Row Line Detection with Auxiliary Segmentation Tasks
…proposed to take advantage of tunnels formed between rows. A simulation environment for evaluating both the model’s performance and camera placement was developed and made available on GitHub, and two datasets to train the models are proposed. The results are shown across different resolutions and…
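A hypothetical sketch of a multi-task design of this kind (not the paper's architecture): a shared encoder feeds both a main head that regresses crop-row line parameters and an auxiliary segmentation head. All layer sizes, the number of predicted lines, and the loss weighting below are assumptions.

```python
import torch.nn as nn

class RowLineNet(nn.Module):
    """Toy multi-task model: shared encoder, auxiliary row-mask head,
    and a main head regressing crop-row line parameters (sizes assumed)."""

    def __init__(self, num_lines: int = 2):
        super().__init__()
        self.encoder = nn.Sequential(
            nn.Conv2d(3, 32, 3, stride=2, padding=1), nn.ReLU(),
            nn.Conv2d(32, 64, 3, stride=2, padding=1), nn.ReLU(),
        )
        # Auxiliary task: per-pixel crop-row mask at 1/4 resolution.
        self.seg_head = nn.Conv2d(64, 1, kernel_size=1)
        # Main task: slope/intercept pair for each row line.
        self.line_head = nn.Sequential(
            nn.AdaptiveAvgPool2d(1), nn.Flatten(), nn.Linear(64, num_lines * 2)
        )

    def forward(self, x):
        feats = self.encoder(x)
        return self.line_head(feats), self.seg_head(feats)

# Training would combine both objectives, e.g.:
# lines, masks = model(images)
# loss = mse(lines, gt_lines) + 0.5 * bce(masks, gt_masks)
```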

有害 posted on 2025-3-30 23:45:10

Multiple Object Tracking in Native Bee Hives: A Case Study with Jataí in the Field
…of errors in the system. We must consider a coupling of both systems in practical applications because ByteTrack counts bees with an average relative error of 11%, EuclidianTrack monitors incoming bees with 9% (21% if there are outliers), both monitor bees that leave, ByteTrack with 18% if there ar…
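For reference, relative-error figures of this kind can be reproduced from raw counts with a trivial helper like the one below (an assumed illustration, not the paper's evaluation code; the sample counts are made up).

```python
def mean_relative_error(predicted_counts, true_counts):
    """Average |predicted - true| / true over clips with nonzero ground truth."""
    errors = [abs(p - t) / t for p, t in zip(predicted_counts, true_counts) if t > 0]
    return sum(errors) / len(errors)

# Hypothetical per-clip bee counts from a tracker vs. manual annotation:
# mean_relative_error([45, 52, 38], [50, 50, 40])  # ~0.063, i.e. ~6.3%
```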

Adjourn posted on 2025-3-31 06:39:08

Sabiá: Portuguese Large Language Models
…turbo. By evaluating on datasets originally conceived in the target language as well as translated ones, we study the impact of language-specific pretraining in terms of 1) capturing linguistic nuances and structures inherent to the target language, and 2) enriching the model’s knowledge about a dom…
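A minimal sketch of the comparison implied here (assumed, not from the paper): score each benchmark, then aggregate separately over natively authored and translated datasets to isolate the effect of language-specific pretraining. Dataset names and accuracies below are placeholders.

```python
from statistics import mean

# Hypothetical results: (dataset name, origin, accuracy) triples.
results = [
    ("dataset_native_a", "native", 0.72),
    ("dataset_native_b", "native", 0.68),
    ("dataset_translated_a", "translated", 0.81),
]

def aggregate(results, origin):
    """Mean accuracy over datasets with the given origin ('native' or 'translated')."""
    scores = [acc for _, o, acc in results if o == origin]
    return mean(scores) if scores else float("nan")

print("native:", aggregate(results, "native"))
print("translated:", aggregate(results, "translated"))
```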

Page: 1 2 3 4 5 [6]
View full version: Titlebook: Intelligent Systems; 12th Brazilian Conference; Murilo C. Naldi, Reinaldo A. C. Bianchi; Conference proceedings 2023; The Editor(s) (if applicable)