To the developers: how often do you use AI tools like Copilot in your everyday work these days?

(I am aware that ML models are not real AI, but that is what they are commonly called. You know what I mean. ;-))

Even though there is a “Don’t need it” option, it is meant more as a matter of personal conviction. In theory, of course, no good developer would actually “need” it.



18 Answers
Palladin007
9 months ago

Well, GitHub Copilot is integrated into Visual Studio and completes what I write. “Often” is a bit of an understatement; when programming, I use it constantly 😀 I don’t have it generate complex logic (usually not, anyway), but it takes a lot of typing off my hands because I don’t have to write everything out.
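
To picture what is meant by that, here is a minimal, made-up sketch (not Palladin007’s actual code) of the kind of repetitive mapping code an assistant typically completes after you have typed the first line or two:

```python
# Illustrative only: a hypothetical DTO and mapping function of the sort
# where a completion tool fills in the remaining, obvious lines.
from dataclasses import dataclass


@dataclass
class CustomerDto:  # invented example, not from the comment
    id: int
    name: str
    email: str
    city: str


def to_row(dto: CustomerDto) -> dict:
    # After typing '"id": dto.id,' an assistant usually suggests the rest.
    return {
        "id": dto.id,
        "name": dto.name,
        "email": dto.email,
        "city": dto.city,
    }
```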

And everyone who says they don’t use AI is either lying or has no idea.
What many people understand by AI is just what gets media attention, i.e. ChatGPT and the sci-fi nonsense, but the topic is actually huge and has been an integral part of many areas for years.

Visual Studio had its own self-learning AI support long before GitHub Copilot; GitHub Copilot is just better. Google search works with AI, otherwise it wouldn’t be possible. I can also imagine that the spam filters at various e-mail providers work with these technologies. Social media uses such systems to offer personalized suggestions. And so on.

And we also use it indirectly, since various technologies are optimized with AI; a relatively recent example is nuclear fusion, where the actuator was designed by a purpose-built system. I don’t know for sure, but it was probably a genetic algorithm.

And none of this started with ChatGPT. It has only become known to the broad public since ChatGPT, and for years AI has been a recurring topic in fake news and in the legitimate fear of manipulation through fake images, videos and audio recordings. Like the AI accounts on Instagram, which I believe existed years before ChatGPT.

Palladin007
9 months ago
Reply to  Palladin007

Oh, and on the subject of reliability:

Such an LLM is not really super reliable.

Over time you learn how to judge the answers. I often ask ChatGPT very specific questions that approach the topic from different angles and then compare the answers; that way you can usually spot false statements quite quickly, because they simply don’t match each other. And then, of course, the explanation is a decisive factor too, because inaccuracies sometimes hide in there, and that’s where I can drill down and find errors.

And on top of that, if the topic is more important, you of course still have to read up on it yourself, but you should do that anyway.

Palladin007
9 months ago
Reply to  Palladin007

In theory, of course, no good developer would actually “need” it.

I would call myself a good developer, and yet: “I need it.”
Simply because I’m lazy about typing and impatient at work 😀 By now I work so fast, partly thanks to autocompletion, which has become such second nature to me that I just don’t want to be without it anymore.

GamersGame
9 months ago

I was very much against it at the beginning, but by now I use it in part. Programming with the help of AI will be the future, even though many resist it. Used correctly, it can currently produce smaller scripts without many logical dependencies (e.g. a regex). Even more complex things can be generated, but you always have to rework them; the AI often makes mistakes and produces non-functional code. Still, reworking it is often faster than writing it all yourself.
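
As an illustration of such a “smaller script without many logical dependencies”, here is a hedged sketch of the kind of regex task one might hand to an assistant; the pattern and function names are invented for this example and do not come from the comment:

```python
# Hypothetical example of a small, self-contained script an assistant can
# usually generate in one go: extract ISO dates (YYYY-MM-DD) from text.
import re

ISO_DATE = re.compile(r"\b(\d{4})-(0[1-9]|1[0-2])-(0[1-9]|[12]\d|3[01])\b")


def extract_dates(text: str) -> list[str]:
    """Return all ISO-formatted dates found in the given text."""
    return [match.group(0) for match in ISO_DATE.finditer(text)]


if __name__ == "__main__":
    sample = "Released 2024-03-15, patched 2024-04-01, not a date: 2024-13-40."
    print(extract_dates(sample))  # ['2024-03-15', '2024-04-01']
```

And exactly as the comment says, such output still needs a review: the pattern above, for instance, happily accepts impossible dates like 2024-02-31.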

All of this is, of course, situation-dependent: in internal and B2B areas, the performance and quality of the code are usually rather secondary, unlike in B2C applications, where the performance and quality of the code (e.g. semantics) also influence Google SEO.

The advantage, and the reason why programmers will use AI in the future, is simply saving time. That trend has been around for years; the reason you use a library is simply: saving time.

Advantage: Create quick and cost-effective web projects

Disadvantage: Smaller developer team required

Xandros0506
9 months ago

Not at all.

I still learned to pull together the information I need from the available sources and to learn something from them in the process.

If I used AI, it would simply make suggestions that don’t have to be correct. It follows that I’d have to gather the information the usual way anyway in order to verify the AI’s results. So that’s extra work that nobody pays me for.

Hannes178
9 months ago
Reply to  Xandros0506

AI is a much more efficient way of researching than googling or the like.

Gehilfling
9 months ago

+ “No, I’m not allowed to.”

Company policies prohibit putting the requirements (even as full text) into an AI, which will of course use them as training data.

Apart from that, I’ve been able to do without it without any problems.

MonkeyKing
9 months ago
Reply to  Gehilfling

Company policies prohibit putting the requirements (even as full text) into an AI, which will of course use them as training data.

Not necessarily. Not in an OpenAI Team Account.

apachy
9 months ago

Not at all. Haven’t found any real use case for it yet. The problem is that the results are often not correct. That’s fine when you have an image created, where creativity is what matters and there is no right or wrong. With software, of course, it’s not that simple.

Most of what I would use it for I have already (partly) automated: snippets, templates, multiline editing, regex replace, or generating code using code or the SQL engine.
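
For readers unfamiliar with “generating code using code”, here is a minimal sketch of the idea, assuming an invented field list and output format (none of this is from apachy’s setup):

```python
# Sketch of code that generates code: render a simple data class from a
# field list instead of typing the boilerplate by hand.
FIELDS = [("order_id", "int"), ("customer", "str"), ("total", "float")]  # made-up fields


def render_dataclass(name: str, fields: list[tuple[str, str]]) -> str:
    """Return the source of a dataclass module with the given fields."""
    lines = ["from dataclasses import dataclass", "", "", "@dataclass", f"class {name}:"]
    lines += [f"    {field_name}: {type_name}" for field_name, type_name in fields]
    return "\n".join(lines) + "\n"


if __name__ == "__main__":
    # In practice the output would be written to a file as part of a build step.
    print(render_dataclass("Order", FIELDS))
```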

On top of that, the focus on these chatbots naturally falls away for me. Copilot costs money and, for example, has no integration for my most-used IDE. And copying code into a chat is problematic, of course.

However, I usually have no problem translating what I want into code. The problems usually lie more in understanding the problem, getting a grip on it, communicating with the customer, etc.

But I can imagine that in a few years this stuff will run well and fast locally, have more integrations, and be something like a better snippet collection with a bit of context behind it.

I’m sceptical about quality. It is trained on the current stuff, both good and bad, so it ends up average. Usage spreads more and more, new training data then comes from the good and the bad of that average output, and so on. That could trigger a downward spiral.

JokesOnYou
9 months ago

Automatically generated code is common practice for us and takes a lot of work off our hands, especially when things have to move fast. Where the code doesn’t meet expectations, I fall back on the other approach.

JokesOnYou
9 months ago

I think I misjudged this. We don’t use AI to write really important code, but to generate base classes or create interfaces. Business logic is strictly confidential and must not leave the company, so we can really only use the tools for very simple things.
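
As a hedged illustration of such a “very simple thing”: boilerplate of this kind can be generated without exposing any business logic. The repository interface below is a generic example, not the commenter’s code:

```python
# Generic repository interface of the sort an assistant can generate as
# boilerplate; the confidential business logic stays in the implementations.
from abc import ABC, abstractmethod
from typing import Generic, Optional, TypeVar

T = TypeVar("T")


class Repository(ABC, Generic[T]):
    """Minimal CRUD interface; concrete implementations remain in-house."""

    @abstractmethod
    def get(self, entity_id: int) -> Optional[T]: ...

    @abstractmethod
    def add(self, entity: T) -> None: ...

    @abstractmethod
    def delete(self, entity_id: int) -> None: ...
```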

ingosteinkeseo
2 months ago

At the moment I occasionally use JetBrains’ “intelligent” autocompletion, but I’m still skeptical, because the quality is usually far too poor.

EinAlexander
9 months ago

How often do you use AI tools like copilot in everyday life?

Almost daily. It saves me routine work while programming, as well as looking things up in manuals when I’m faced with an unusual task that I have to solve in a language I rarely use.

Alex

Hannes178
9 months ago

It has often helped me find a solution that would otherwise have taken me several hours.