How to measure MACs? #37
Hi, thanks for your nice work :)
I also watched your presentation recording from the conference.
I want to apply PoolFormer to my own work, so can I ask how you measured the MACs of the architectures introduced in your paper?
And if it is not too much trouble, could you share your measurement code?

Comments

Hi @DoranLyong , thanks for your attention. I used to count the MACs of PoolFormer with code similar to this. However, I found it convenient and accurate enough to use the fvcore package. An example is shown in misc/mac_count_with_fvcore.py. I will update the MAC results measured by fvcore on arXiv soon.
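For readers who don't want to open the script, the gist of counting with fvcore is roughly the following. This is a minimal sketch assuming timm is installed; the repo's actual misc/mac_count_with_fvcore.py may differ in details, and resnet50 is used purely for illustration:

```python
import timm
import torch
from fvcore.nn import FlopCountAnalysis

# Build any model to profile; resnet50 is only an example here.
model = timm.models.resnet50()
model.eval()

# fvcore traces the model on a dummy input at the target resolution.
dummy_input = torch.randn(1, 3, 224, 224)
counter = FlopCountAnalysis(model, (dummy_input,))

# fvcore calls the result "flops", but it counts one multiply-accumulate
# as a single operation, so this total is effectively a MAC count.
print(f"{counter.total() / 1e9:.1f}G MACs")
```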
Thanks for your kind response and code example! I checked your code and got confused about why you said the FLOPs reported by fvcore are actually MACs. I understand that FLOPs count additions and multiplications separately, while a MAC counts both at once. Formally, MACs = FLOPs / 2. When I checked your code in the misc directory, I expected a division by 2, but there isn't one. Could you let me know why FLOPs here actually means MACs?
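In concrete numbers, the conversion the question assumes looks like this (a hypothetical helper for illustration, not part of the repo):

```python
# If a counter reported "true" FLOPs (additions and multiplications
# tallied separately), dividing by 2 would approximate the MAC count.
def flops_to_macs(true_flops: float) -> float:
    return true_flops / 2

# e.g. ResNet-50, using the figures quoted in the reply below:
print(flops_to_macs(8.2e9) / 1e9)  # 8.2 GFLOPs -> 4.1 GMACs
```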
Hi @DoranLyong , this is a common misnomer in many computer vision papers: "FLOPs" in these papers actually means MACs. For example, ResNet-50 actually has 8.2G FLOPs and 4.1G MACs. The fvcore package also follows this convention. You can check it by specifying the model in misc/mac_count_with_fvcore.py:

```python
model = timm.models.resnet50()
# or
# from torchvision.models import resnet50
# model = resnet50()
```

The output of 4.1G actually means MACs.
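If you want a per-module breakdown rather than just the total, fvcore can also render a table. A small sketch on top of the snippet above, assuming the same imports; per the discussion in this thread, the FLOPs column it prints is really a MAC count:

```python
import timm
import torch
from fvcore.nn import FlopCountAnalysis, flop_count_table

model = timm.models.resnet50().eval()
counter = FlopCountAnalysis(model, (torch.randn(1, 3, 224, 224),))

# Prints parameter and operation counts broken down by module.
print(flop_count_table(counter))
```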
Got it! =) Thank you very much! I learned something new thanks to you.
You are welcome :)