
ChatGPT vs Claude for Code

Overview

This page compares ChatGPT and Claude for code across different aspects of AI services, with data mined from genuine user reviews and ratings, including: All, Interesting, Helpfulness, Website Frontend, Correctness. AI store is a platform of genuine user reviews, ratings, and AI-generated content, covering a wide range of categories including AI Image Generators, AI Chatbot & Assistant, AI Productivity Tool, AI Video Generator, AI in Healthcare, AI in Education, AI in Lifestyle, AI in Finance, AI in Business, AI in Law, AI in Travel, AI in News, AI in Entertainment, AI for Kids, AI for Elderly, AI Search Engine, AI Quadruped Robot.

Reviews Comparison




  • DerekZZ 2024-11-05 12:16
    Interesting:4,Helpfulness:4,Website Frontend:3,Correctness:4

    I asked ChatGPT (GPT-4o) a question about Python coding: "sandbox python programs with Docker environment". The responses are helpful and the generated code samples are clear. But the website frontend has a URL rendering issue with the "Docker installation guide" link. The text is highlighted, which suggests it is a link, but there is no href or URL associated with it, and I can't find the installation guide it refers to. Is this a known problem? Reproducing link: https://chatgpt.com/share/67298cb0-47c0-8005-96ea-0562df9f7158
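The prompt in this review ("sandbox python programs with Docker environment") can be sketched as follows. This is a minimal illustration, not ChatGPT's verbatim answer: the function names and the specific resource limits are assumptions, and it assumes the `docker` CLI is installed on the host.

```python
import subprocess


def build_sandbox_cmd(script_path: str, image: str = "python:3.12-slim") -> list[str]:
    """Build a `docker run` command that sandboxes a Python script.

    --rm removes the container afterwards, --network none cuts off
    network access, --memory caps RAM, and the script is mounted read-only.
    """
    return [
        "docker", "run", "--rm",
        "--network", "none",
        "--memory", "256m",
        "-v", f"{script_path}:/sandbox/script.py:ro",
        image,
        "python", "/sandbox/script.py",
    ]


def run_sandboxed(script_path: str, timeout: int = 30) -> subprocess.CompletedProcess:
    """Execute the script in the throwaway container and capture its output."""
    return subprocess.run(
        build_sandbox_cmd(script_path),
        capture_output=True, text=True, timeout=timeout,
    )
```

Calling `run_sandboxed("/path/to/untrusted.py")` then gives you the script's stdout/stderr via the returned `CompletedProcess`, while the container's isolation limits what the untrusted code can touch.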



  • ChenYZ 2024-10-29 12:08
    Interesting:4,Helpfulness:5,Correctness:5

    I asked ChatGPT (GPT-4o) a coding-related question, "export prompt and completion data in json format and convert to parquet format", and the Python code generated by ChatGPT is quite good. It gives me examples of the detailed JSON data format, Python code to convert the raw JSON data to Parquet format, and it even recommends all the required Python libraries. A very helpful response from ChatGPT (GPT-4o)! The response is structured as: Step 1: Export Data in JSON Format; Step 2: Convert JSON to Parquet using Python; Explanation of the Code; Prerequisites. The ChatGPT dialogue sharing link is here: https://chatgpt.com/share/67205caa-a068-8005-be6b-76eeed480fbe



  • wilsonz1989 2024-09-03 18:24
    Interesting:3,Helpfulness:3,Correctness:5

    ChatGPT did a good job on this Python coding test, which is to "implement self attention layer in transformer using pytorch package". Overall, the Python code is correct and concise. After comparing ChatGPT vs Gemini vs Claude for coding, I find that the overall response from ChatGPT lacks a detailed explanation of each section of the code, which makes it less helpful than the response from Claude Sonnet.



  • wilsonz1989 2024-06-19 00:40

    My question for ChatGPT was "Show me the latex code of KL Divergence". Even though it gives me the LaTeX code for KL divergence right at the front, it also provides a lot of unrelated information about basic LaTeX usage, such as how to include packages, which is not very relevant to my original intent.
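For reference, the snippet this prompt is after is just the one display-math line for the KL divergence of discrete distributions (standard notation, not ChatGPT's verbatim output):

```latex
\[
  D_{\mathrm{KL}}(P \parallel Q) = \sum_{x \in \mathcal{X}} P(x) \log \frac{P(x)}{Q(x)}
\]
```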



  • William Garcia 2024-06-19 00:37

    I asked ChatGPT to write the Python code for a QuickSort implementation. The results are attached. Pretty amazing, right?
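The attachment is not preserved on this page, so here is a typical QuickSort in Python for context. This is a common textbook version, not ChatGPT's verbatim output:

```python
def quicksort(arr: list) -> list:
    """Out-of-place quicksort: clear and concise, O(n log n) on average."""
    if len(arr) <= 1:
        return arr
    pivot = arr[len(arr) // 2]
    # Partition into elements below, equal to, and above the pivot.
    left = [x for x in arr if x < pivot]
    middle = [x for x in arr if x == pivot]
    right = [x for x in arr if x > pivot]
    # Sort the partitions recursively and stitch them back together.
    return quicksort(left) + middle + quicksort(right)
```

An in-place variant (Lomuto or Hoare partitioning) avoids the extra lists but is a little harder to read; models usually produce one of these two shapes.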



  • DerekZZ 2024-06-18 14:29

    ChatGPT provides very good and clear answers to my question "Please give me some introduction of Maxwell equations". The best part is that it also displays the equations in a beautiful format with generated LaTeX code, such as the equation for Gauss's Law for Electricity and the others. Compared to the answers from Gemini, it definitely wins.
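For context, the four Maxwell equations in differential form render from LaTeX like this (standard textbook notation, not ChatGPT's verbatim output); the first line is the Gauss's Law for Electricity equation the review mentions:

```latex
\begin{align}
  \nabla \cdot \mathbf{E}  &= \frac{\rho}{\varepsilon_0}
    && \text{(Gauss's law for electricity)} \\
  \nabla \cdot \mathbf{B}  &= 0
    && \text{(Gauss's law for magnetism)} \\
  \nabla \times \mathbf{E} &= -\frac{\partial \mathbf{B}}{\partial t}
    && \text{(Faraday's law)} \\
  \nabla \times \mathbf{B} &= \mu_0 \mathbf{J}
    + \mu_0 \varepsilon_0 \frac{\partial \mathbf{E}}{\partial t}
    && \text{(Ampère--Maxwell law)}
\end{align}
```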




  • zyncg 2024-09-05 08:24
    Interesting:4,Helpfulness:4,Correctness:4

    Claude does an amazing job writing code for the Trapping Rain Water LeetCode problem. It also provides an explanation of the algorithm's complexity.
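Claude's actual answer is not reproduced in the review, but the standard two-pointer solution to this problem (LeetCode 42) looks like this. A common reference solution, not Claude's verbatim output:

```python
def trap(height: list[int]) -> int:
    """Trapping Rain Water via two pointers: O(n) time, O(1) extra space.

    Water above each bar is bounded by the smaller of the maximum heights
    to its left and right, so we always advance from the lower side.
    """
    if not height:
        return 0
    left, right = 0, len(height) - 1
    left_max, right_max = height[left], height[right]
    water = 0
    while left < right:
        if left_max <= right_max:
            left += 1
            left_max = max(left_max, height[left])
            water += left_max - height[left]
        else:
            right -= 1
            right_max = max(right_max, height[right])
            water += right_max - height[right]
    return water
```

The complexity explanation the reviewer praises boils down to: each pointer moves at most n times (linear time), and only a few scalars are tracked (constant space).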



  • wilsonz1989 2024-09-03 18:06
    Interesting:3,Helpfulness:5,Correctness:5

    I've tested the AI model implementation ability of multiple AI code generators. For the prompt "Self Attention Layer in Transformer", Claude provides the best overall response. The PyTorch implementation of the Transformer model's self-attention layer is correct, and the detailed explanation is really helpful. For example, it introduces a breakdown of the Python code, including the "__init__" method, the forward method, the input tensors, the attention energy function, etc. I will also attach the detailed text response here:

    """
    Claude Sonnet 3.5

    Here's a breakdown of how this code works:

    - The __init__ method initializes the linear layers for the value, key, and query projections, as well as the final linear layer that combines the multi-head outputs.
    - The forward method takes in the values, keys, query, and an optional mask tensor.
    - The input tensors are reshaped to have a separate dimension for the number of heads, so that we can perform multi-head attention in parallel.
    - The value, key, and query tensors are projected using the respective linear layers.
    - The attention energy is computed using the einsum operation, which performs the scaled dot-product attention operation across all heads in parallel.
    - If a mask tensor is provided, the attention energies corresponding to the masked positions are set to a very large negative value, so that they get zeroed out after applying the softmax.
    - The attention probabilities are computed by applying softmax to the attention energies.
    - The values are weighted by the attention probabilities using another einsum operation, and the results are concatenated across the head dimension.
    - Finally, the concatenated tensor is projected using the final linear layer to obtain the output of the self-attention layer.

    You can use this SelfAttention module as a part of your Transformer architecture by applying it to the input embeddings and optionally using the output as the input to the feed-forward network.
    """
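The code the breakdown above describes is not reproduced in the review. A PyTorch sketch that matches the description step by step might look like this; it is an assumption-based reconstruction (names like `embed_size` and `heads` are guesses), not Claude's verbatim output:

```python
import torch
import torch.nn as nn


class SelfAttention(nn.Module):
    """Multi-head self-attention, following the review's breakdown."""

    def __init__(self, embed_size: int, heads: int):
        super().__init__()
        assert embed_size % heads == 0, "embed_size must divide evenly by heads"
        self.heads = heads
        self.head_dim = embed_size // heads
        # Linear layers for the value, key, and query projections.
        self.values = nn.Linear(embed_size, embed_size, bias=False)
        self.keys = nn.Linear(embed_size, embed_size, bias=False)
        self.queries = nn.Linear(embed_size, embed_size, bias=False)
        # Final linear layer that recombines the multi-head outputs.
        self.fc_out = nn.Linear(embed_size, embed_size)

    def forward(self, values, keys, query, mask=None):
        N, seq_len = query.shape[0], query.shape[1]
        # Project, then reshape to give each head its own dimension.
        values = self.values(values).reshape(N, -1, self.heads, self.head_dim)
        keys = self.keys(keys).reshape(N, -1, self.heads, self.head_dim)
        queries = self.queries(query).reshape(N, seq_len, self.heads, self.head_dim)
        # Attention energies: query-key dot products across all heads at once.
        energy = torch.einsum("nqhd,nkhd->nhqk", queries, keys)
        if mask is not None:
            # Masked positions get a huge negative energy; softmax zeroes them.
            energy = energy.masked_fill(mask == 0, float("-1e20"))
        # Scaled softmax turns energies into attention probabilities.
        attention = torch.softmax(energy / (self.head_dim ** 0.5), dim=-1)
        # Weight the values by attention, then concatenate the heads.
        out = torch.einsum("nhqk,nkhd->nqhd", attention, values)
        out = out.reshape(N, seq_len, self.heads * self.head_dim)
        # Final projection back to the embedding dimension.
        return self.fc_out(out)
```

As the quoted response says, this module slots into a Transformer block by feeding it the input embeddings and passing its output on to the feed-forward network.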