
Microsoft will finally answer questions about DRM and more next week

Microsoft Corp. (MSFT) will announce its next generation console next Tuesday (May 21) ahead of the 2013 Electronic Entertainment Expo (E3).

Rumored to be named either the Xbox Infinity or Xbox 720, the upcoming console stirred controversy over rumors that it would use digital rights management (DRM) to ban used games.  Rude Twitter remarks led to the firing of one Microsoft staffer as the debate over the DRM grew heated.

No one knows for sure, though, whether Microsoft actually planned the DRM or whether it is sticking to any such plans after the controversy.

What is expected is that the Microsoft console will pack similar hardware to arch-rival Sony Corp.'s (TYO:6758) already-announced PS4.  That console packs 8 GB of DRAM, alongside a GPU and x86 CPU from Advanced Micro Devices, Inc. (AMD).  Given that NVIDIA Corp. (NVDA) says it isn't interested in making console graphics, it seems likely AMD could pop up in the next Xbox, as well.

The PS4 -- whose price has not yet been announced -- is rumored to land between $400 and $500 USD.  The next Xbox is rumored to cost $500 USD, but also to be available in an ad-subsidized form for $200 to $300 USD.

The console is rumored to introduce 1080p Kinect motion controls, a work in progress that has been covered in leaks for some time.

Source: ExtremeTech



By AMDftw on 5/15/13, Rating: 0
By Crazyeyeskillah on 5/15/2013 9:52:48 AM , Rating: 4
It doesn't matter if it supports it, there is no way in hell games will be able to push that many pixels yet. The 7990 that just got released can't even deliver 25fps avg at 4K resolutions on any new AAA game - Crysis 3, Tomb Raider, etc.

You can upscale it and show it at a higher resolution but this generation of hardware used in consoles isn't remotely close to being able to really drive that kind of resolution. Cable may be able to stream that high but a gaming console rendering at that fidelity, preposterous!

By StevoLincolnite on 5/15/2013 10:21:49 AM , Rating: 2
Well. According to rumors the 4k resolution support wouldn't be for games, just 4k resolution Blu-ray, which makes sense as Sony is a large pusher of the format.

However, I don't even see the consoles lasting long with games running at 1080p; the new Killzone has already sacrificed framerate (i.e., 30fps rather than 60fps) in order to drive up the visuals, and that's a launch title!
So chances are the next-generation consoles, like the current ones, will eventually have all their games at 720p or lower a couple of years after release anyway; there's only so much you can do with fixed hardware before you cut back in other areas.

Then again, if you are incredibly worried about visuals, resolutions, and frame rates, and you visit DailyTech/AnandTech, you can already get such things in games now by building your own Mini-ITX gaming PC and using Steam in Big Picture mode.

By karimtemple on 5/15/2013 10:50:06 AM , Rating: 3
The problem with your analysis (aside from the fact that this is launch software) is that this game is targeting a far higher level of graphics quality than anything you'd expect to see today at 1080p60. Current demo performance suggests that the game is locked down to 30fps, the implication of which is that the game can reach some higher framerate but fluctuates between that and something just above 30fps. The PS4 has the hardware to run Battlefield 3 on Ultra @ 1080p far above 60fps. All of this "Gen 8 games are going to be 1080p30" talk is foolish.
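A 30fps lock of the kind discussed above is, at heart, just a frame-time budget: render, then idle out whatever is left of the budget so frames are evenly paced. A minimal sketch (all names mine, not from any real engine):

```python
# Frame-cap arithmetic behind a "locked 30fps" target.

def frame_budget(target_fps):
    """Time available per frame, in seconds."""
    return 1.0 / target_fps

def sleep_needed(render_time, target_fps):
    """Idle time after rendering one frame to hold the cap (0 if we missed)."""
    return max(0.0, frame_budget(target_fps) - render_time)

# A frame rendered in 20 ms fits a 30fps (33.3 ms) budget with headroom...
print(round(sleep_needed(0.020, 30) * 1000, 1))  # 13.3 ms to spare
# ...but blows through a 60fps (16.7 ms) budget entirely.
print(round(sleep_needed(0.020, 60) * 1000, 1))  # 0.0 -- frame missed
```

This is why a game that "can reach" more than 30fps may still ship locked: a frame that sometimes takes 20 ms and sometimes 30 ms holds a perfectly steady 30fps cap but stutters badly at an uncapped rate.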

By BRB29 on 5/15/2013 2:09:01 PM , Rating: 2
Please note the following

1. Consoles are fixed specs, and therefore code can be optimized to run much better than on an open platform.

2. The old hardware can run 720p and 1080p. What makes you think this one can't manage at least 1080p?

3. There's more than just resolution that affects fps. For example: shadows, texture detail, tessellation, draw distance, physics, polygons, HDR, AA, AF, etc...

Limiting to 30fps may be a good thing. They can add more graphics, and it can be used to stop some stuttering or frame-spike issues. For example, not limiting the framerate in SC2 caused frame-lag spikes.

BTW the new consoles have the equivalent of a Radeon 7850 GPU. I'm willing to bet they can get games to run much smoother and at higher fps on a console.
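On point 2, the raw geometry is easy to check; nothing vendor-specific here, just pixel counts per frame:

```python
# Per-frame pixel counts for the resolutions debated in this thread.

resolutions = {
    "720p":  (1280, 720),
    "1080p": (1920, 1080),
    "4K":    (3840, 2160),
}

pixels = {name: w * h for name, (w, h) in resolutions.items()}
base = pixels["720p"]

# 4K draws 4x the pixels of 1080p and 9x those of 720p.
for name, count in pixels.items():
    print(f"{name}: {count:,} pixels per frame ({count / base:.2f}x 720p)")
```

So "the old hardware can run 1080p" and "this hardware can't run 4K" are not in tension: the jump from 1080p to 4K quadruples the per-frame workload before any of the other settings in point 3 even come into play.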

By karimtemple on 5/15/2013 2:22:11 PM , Rating: 2
The GPU is more powerful than the 7850.

Also, part of his concern was multi being locked at 30fps, but the multiplayer parts of games are often rendered differently, so that could still be 60fps.

By StevoLincolnite on 5/15/2013 8:42:21 PM , Rating: 2
The GPU is more powerful than the 7850.

Not really. Remember, GDDR5 bandwidth is shared on the PS4; I would be surprised if the graphics get more than 150 GB/s.
Clockspeeds are lower, so that will limit the total compute performance.
It could have the hardware of a Radeon 7970, but if it's clocked low enough, it probably won't be any faster than a 7850 or 7790.

By karimtemple on 5/16/2013 1:14:43 AM , Rating: 3
The 7850 feeds itself about 150 GB/s. The PS4 is currently spec'd at 176 GB/s. The GPU has 18 compute units vs. the 7850's 16, will be HSA-compatible, and will routinely have access to 6 or 7 GB of VRAM vs. 2. This is aside from the fact that the compute units will probably each be customized and more advanced than the 7850's.
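The compute-unit comparison follows directly from GCN's layout (64 shaders per compute unit, 2 ops per shader per clock via fused multiply-add). A quick sketch using the clock speeds reported at the time (7850 at 860 MHz; the PS4 GPU rumored at 800 MHz):

```python
def gcn_tflops(compute_units, clock_mhz):
    """Peak single-precision throughput for a GCN-style GPU:
    64 shaders per CU, 2 ops (fused multiply-add) per shader per clock."""
    return compute_units * 64 * 2 * clock_mhz * 1e6 / 1e12

print(round(gcn_tflops(16, 860), 2))  # Radeon HD 7850: 1.76 TFLOPS
print(round(gcn_tflops(18, 800), 2))  # PS4 GPU as spec'd: 1.84 TFLOPS
```

The two extra compute units slightly more than make up for the lower clock, which is why both "more powerful than a 7850" and "about a 7850" are defensible readings of the same spec sheet.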

By Mint on 5/15/2013 11:50:10 AM , Rating: 2
Choosing ultra-high resolution is a tradeoff. You'll obviously have to tone down the graphics workload per pixel versus 1080p.

That doesn't mean we have to settle for Quake-level visuals at 1080p. Even a lowly HD7790 can do 1.8 TFLOPS, which works out to 3600 ops per pixel for 4K at 60fps. That's an oversimplification, but clearly there's enough horsepower to give decent visuals at 4K.
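The 3600 figure checks out as back-of-envelope arithmetic: divide peak FLOPS by pixels drawn per second.

```python
# Mint's ops-per-pixel estimate for a 1.8 TFLOPS part at 4K/60fps.
tflops = 1.8e12          # roughly an HD 7790
width, height, fps = 3840, 2160, 60

pixels_per_second = width * height * fps   # ~498 million pixels/s
ops_per_pixel = tflops / pixels_per_second
print(round(ops_per_pixel))  # 3617, i.e. the "3600 ops per pixel" above
```

As the comment says, this is an oversimplification (it ignores bandwidth, overdraw, and non-shading work), but it does bound the budget per pixel.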

By karimtemple on 5/15/2013 9:53:29 AM , Rating: 2
They both will, but most of it will probably be upscaling. I'd expect 99% of the games themselves to render at 1080p.

By AMDftw on 5/15/2013 12:45:42 PM , Rating: 1
All I wanted it for was for Movies. Games, I could care less about in 4K.

By B3an on 5/15/2013 6:45:08 PM , Rating: 3

Do you even know what you're saying?!

By Arsynic on 5/15/2013 1:09:53 PM , Rating: 2
Sony will support 4K for movies only. Not games.

By rcabor on 5/15/2013 11:39:41 AM , Rating: 3
Given that NVIDIA Corp. (NVDA) says it isn't interesting in making console graphics,

Had to read that twice, it made my head hurt. :)

RE: Interested?
By Motoman on 5/15/2013 1:00:48 PM , Rating: 3

AMD won everything this round, so now we're going to pretend we didn't want it anyway.

RE: Interested?
By karimtemple on 5/15/2013 1:52:21 PM , Rating: 2
It is true that margins on embedded chips are really low, though.

But you're right. I think what really happened here is there was no way for nVidia to offer a superior value proposition to AMD -- they don't have powerful CPUs. They're doing some acceptable ARM stuff, but not even ARMv8 was going to cut it when you're talking about the successors to the 360 and PS3. They also don't have an HSA-compatible GPU. AMD had the total platform package.

RE: Interested?
By rippleyaliens on 5/15/2013 2:45:11 PM , Rating: 2
Sure AMD won, BUT what did they win?? Contracts to build, over 5 years, 100 million devices?? (Being generous)..

Unfortunately this is a failure. With console designs LOCKED IN for the 5-year process.. IN 2 YEARS, 4K will be at commodity pricing; they already have a 55' 4K TV selling for $1300..

People say, "I only want it for movies"-- SURE YOU DO..
Once a GAME truly pushes the boundaries and shows what 4K can do, THINGS WILL CHANGE.. They always have, always will..
Same was said from VGA to SVGA ----- 1080p to Eyefinity-- AND now 4K!!!..

Remember folks, IF you can't afford it, that doesn't mean it won't exist. Not to say anything money-wise. BUT -- 1080p was considered "not needed", YET you can go into wallyworld and get a 50' 1080p for under $500..

AMD won the right to be in 2x consoles that have a shelf life of 6-7 years. 7 years ago, we were running single-core processors, with video cards topping out at 512 MB of RAM. 250GB hard drives had just hit. Laptops had 20-40GB hard drives. No tablets, nada..

SO yes AMD won.. NOW they have to allocate resources to support 2 consoles.

RE: Interested?
By karimtemple on 5/15/2013 2:58:32 PM , Rating: 2
An AMD GPU is on the Wii U as well.

RE: Interested?
By Manch on 5/16/2013 9:16:18 AM , Rating: 2
55' 4k TV for $1300?!?!?!!! I dont think there is even room in my yard for it! oh wait you mean 55" :D

Here's the quick n dirty, so I may fuzz over some details. AMD is a design house; they don't fabricate anything anymore. The chips may be designed by AMD, but Sony, Nintendo, and MS bought/licensed their specific designs from AMD. MS with the original Xbox didn't do this. They were using a COTS proc and GPU, so MS pushed hard to launch the 360 ASAP because the supply channels were drying up, and shortly afterwards so did the consoles. With the 360, by buying/licensing the design, they had total control and did not have to pay the exorbitant prices they did for the original Xbox parts, and they have control over their supply.

So the allocation of resources by AMD is minimal at this stage. At most they will support die revisions over the life of the console, but other than that it's a steady source of income for AMD for the near future.

RE: Interested?
By karimtemple on 5/16/2013 9:43:50 AM , Rating: 2
AMD is not doing licensing this generation, they're doing the hardware. It's a gamble, but there's a lot more money to be had. When you're struggling, you tend to lock in revenue streams whenever possible.

RE: Interested?
By Manch on 5/16/2013 12:17:31 PM , Rating: 2
Oh yeah, because of the x86 portion of the APU? Still, those respective designs will remain exclusive to the respective buyers, and MS/Sony/Nintendo don't have to worry about not having chips available. As long as those consoles sell, it should be a pretty steady stream of revenue for AMD. Plus, as die processes shrink, they will save money/increase profit.

RE: Interested?
By kilkennycat on 5/15/2013 3:11:59 PM , Rating: 2
Since AMD is verging on bankruptcy, you can bet that Sony (and MS, if they use AMD as seems likely) will have contractually established complete design-update and production rights to the AMD console chips in the event that AMD goes belly-up.... This is a cut-throat-margin business. See the following for one historical reason why nVidia does not want to be involved:-

RE: Interested?
By Cheesew1z69 on 5/16/2013 12:11:56 AM , Rating: 3
They are on the verge of bankruptcy? I doubt that will ever happen as that would give Intel a virtual monopoly.

Cant Wait
By Mitch101 on 5/15/2013 9:56:58 AM , Rating: 3
I hope Microsoft doesn't stick to DRM for used games. Not everyone can afford new games, especially when they launch at $60.00 a pop, but I can understand if they do this, because game consoles seem to take a lot to get to profitability.

In all honesty, I bought a 360 this round because, at the time I bought it, the PS3 was out of my price range just for the console. Now part of me is tempted to get a PS4 and alternate between systems every other console release, or at least until prices drop to where they are now. Why? The exclusives that one side of the fence gets. I love Halo and Gears, but I'm sure I would love their competitors on the Sony side as well.

The one thing that might keep me in camp Microsoft is Kinect. Sure, some of the games don't go over well, but the kids love it and they get some exercise, and the wife loves Dance Central. I have been tempted lately to pick up a PS3 to play some of those exclusives not on the 360.

Either way, I think Nintendo is losing this round. You hear nothing of the Wii U, but fingers crossed the replacement for the DS/DSi/3DS is a hit. I like having Nintendo around.

RE: Cant Wait
By Crazyeyeskillah on 5/15/2013 10:03:39 AM , Rating: 2
I bought a Wii U and sold it 3 weeks later; there were no games, and I wasn't going to sit on it for a year before something came out. It's really unfortunate; this isn't the N64 generation, where you could buy the console and play Mario 64 for 6 months with no complaints. I'll look at it again when the Xenoblade successor comes out, but it's really unfortunate they came out with a great little device and no real support for it.

RE: Cant Wait
By jimbojimbo on 5/15/2013 10:55:20 AM , Rating: 2
I'm probably just going to buy this as well with the Kinect system for my niece and nephew. Keep it away from home and I can still be productive, they get some exercise out of it, and we can play like idiots when I visit. Win win win.

RE: Cant Wait
By wempa on 5/15/2013 12:30:52 PM , Rating: 2
I hope Microsoft doesn't stick to DRM for used games. Not everyone can afford new games, especially when they launch at $60.00 a pop, but I can understand if they do this, because game consoles seem to take a lot to get to profitability.

There are more user-friendly ways to encourage people to buy new games over used ones. I liked the way Blizzard used to do it with the first 2 Diablo games: you needed a valid CD key in order to play online. I think they should focus on things like that and, of course, on making the games as great as possible. That is a better way to deal with the used-games market. Fighting it with things like online DRM will just push people away harder. That's what happened with me: I stopped supporting Blizzard when they dropped LAN support with Starcraft 2. It's not like there is a shortage of alternative games/systems. I'll gladly keep my $60 and pick up a great $10 game that's a few years old and will keep me entertained for a while.

Ad subsidized
By wallijonn on 5/15/2013 12:14:00 PM , Rating: 2
If it's ad-subsidized then it will need an always-on network connection. And if it's always connected, then chances are DRM is working in the background. Otherwise they'd have to code commercials and promos into the game. That is as useless as movie trailers on DVDs and BDs, as useless as BD promos on BD discs - if it's a lousy trailer, you will have to live with it for the next 12 to 20 years.

RE: Ad subsidized
By acer905 on 5/15/2013 12:31:14 PM , Rating: 2
Well, Amazon sells ad-supported versions of the Kindle which still have the ability to disable wi-fi. If the ads need to be refreshed, it just "prompts" you to turn the wireless back on so it can refresh.

It's possible they could have a section of storage dedicated to a handful of ads that refresh whenever you happen to connect it to the internet.
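The scheme described is just a local cache with an online-only refresh. A hypothetical sketch (class and method names invented for illustration, not from any real Kindle or console API):

```python
# Offline ad cache: always serve from local storage; only swap the
# cached ads out when a connection happens to be available.

class AdCache:
    def __init__(self, ads):
        self.ads = list(ads)   # handful of ads kept on device
        self.cursor = 0

    def next_ad(self):
        """Always serve something, even with no connection."""
        ad = self.ads[self.cursor % len(self.ads)]
        self.cursor += 1
        return ad

    def refresh(self, online, fresh_ads):
        """Replace the cache only when the device is online."""
        if online:
            self.ads = list(fresh_ads)
            self.cursor = 0
        return online

cache = AdCache(["ad-a", "ad-b"])
print(cache.next_ad())                           # ad-a
cache.refresh(online=False, fresh_ads=["ad-c"])  # skipped: offline
print(cache.next_ad())                           # ad-b, from the old cache
```

This matches the Kindle behavior described above: the device stays useful offline, and going online merely rotates in fresh ads.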

citation needed
By Scabies on 5/15/2013 5:00:57 PM , Rating: 2
Should probably stop calling it the Infinity, since the name was fan-made.

If MS maintains this course....
By dxf2891 on 5/16/2013 10:05:41 AM , Rating: 2
of DRM, I foresee the decline and ultimate death of the Xbox franchise. If this is included, they will lose me as well as thousands if not millions of others as customers.

By Motoman on 5/15/13, Rating: -1
RE: x86?
By karimtemple on 5/15/2013 12:56:34 PM , Rating: 2
Same thing. x64 (read: x86-64) is an extension of x86.

RE: x86?
By Motoman on 5/15/13, Rating: 0
RE: x86?
By Cheesew1z69 on 5/15/2013 1:03:07 PM , Rating: 1
x86-64 (also known as x64) is a 64-bit extension of IA-32, the 32-bit generation of the x86 instruction set. It supports vastly larger amounts of virtual memory and physical memory than is possible on IA-32, allowing programs to store larger amounts of data in memory. x86-64 also provides 64-bit general purpose registers and numerous other enhancements. The original specification was created by AMD, and has been implemented by AMD, Intel, VIA, and others. It is fully backwards compatible with 16-bit and 32-bit x86 code. Because the full x86 16-bit and 32-bit instruction sets remain implemented in hardware without any intervening emulation, existing x86 executables run with no compatibility or performance penalties, whereas existing applications that are recoded to take advantage of new features of the processor design may achieve performance improvements.

RE: x86?
By Motoman on 5/15/13, Rating: -1
RE: x86?
By Motoman on 5/15/13, Rating: -1
RE: x86?
By karimtemple on 5/15/2013 1:29:33 PM , Rating: 2
lol. But this isn't about software. We're talking about processor architecture. The architecture is x86.

RE: x86?
By zephyrprime on 5/15/2013 1:38:19 PM , Rating: 2
It's actually not uncommon for "x86" to refer to the entirety of the x86 architecture, including x64. People speak imprecisely. I'm sure the software will all be 64-bit, since 64-bit on x86 is a little faster than 32-bit software.

RE: x86?
By karimtemple on 5/15/2013 1:57:47 PM , Rating: 3
Honestly, it's uncommon for hardware to be referred to as "x64" after like 2006. Once Intel switched over, nothing was 32-bit anymore except netbooks (which died quickly and painlessly in 2010).

RE: x86?
By Concillian on 5/15/2013 2:00:18 PM , Rating: 2
How often does x86 refer to only the 32 bit version of the architecture when you are talking about a system with 8GB memory standard?

Who cares what conventions are. If you use your brain it's pretty obvious what they're describing in this particular case.

RE: x86?
By karimtemple on 5/15/2013 2:08:00 PM , Rating: 3
In his defense:

1) He was clearly thinking about the convention marking 32-bit/64-bit software, which does persist to this day.

2) There are ways to get a 32-bit processor to talk to more than 4GB of memory; it's just that they're pointless when you can just build a 64-bit part.
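On point 2: the mechanism alluded to is PAE (Physical Address Extension), which widens physical addresses on 32-bit x86 from 32 to 36 bits while each process keeps its 32-bit virtual space. The arithmetic:

```python
# Address-space limits with and without PAE on 32-bit x86.
GB = 2**30

print(2**32 // GB)  # plain 32-bit physical limit: 4 GB
print(2**36 // GB)  # PAE physical limit: 64 GB
```

So a 32-bit machine can physically hold well over 4GB of RAM, but no single process can address more than 4GB of it at once, which is exactly why the workaround is "pointless when you can just build a 64-bit part."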

RE: x86?
By Dug on 5/15/2013 2:46:13 PM , Rating: 2
You really don't know what you are talking about. All you are doing is quoting from Wikipedia, which is trying to explain to you that this denotes software.
x86 does not mean 32-bit, which is clearly explained.

And I would hope that the mods ban you for such immature language and inability to comprehend.

RE: x86?
By Motoman on 5/15/2013 8:16:07 PM , Rating: 1
You people are utterly retarded. You must've been born 5 minutes ago if you think x86 isn't commonly used to describe 32-bit *period* - software or hardware - just as x64 is used to refer to 64-bit. Across the board.

The depth of the ignorance of people on this site is just astounding sometimes. Pull your heads out of your a$ses and look around once in a while.

RE: x86?
By inighthawki on 5/15/2013 9:57:08 PM , Rating: 2
Anyone who actually deals with hardware will tell you it's incredibly common to refer to "x86" as any CPU based on Intel's x86 family. This means x86 and x86-64. x86 is more commonly just synonymous with being the "Intel" based architecture (or at least the common one).

So maybe you should pull your head out of your a$s because you're clearly so full of yourself that you can't accept that you might be wrong every once in a while. Your arrogance is astounding.

RE: x86?
By Motoman on 5/16/2013 10:42:17 AM , Rating: 2
Your lack of any recognition of reality is well-established here in these forums.

It's quite possible that I've been working in the computer industry longer than you've been alive.

You. Are. Wrong.

Deal with it.

RE: x86?
By Cheesew1z69 on 5/16/2013 11:11:27 AM , Rating: 2
Sorry, but it's you who is wrong.

RE: x86?
By inighthawki on 5/16/2013 11:26:18 AM , Rating: 2
lol, if only you could hear yourself talk. It always amazes me how people who are dumb and THINK they're smart try to give themselves so much credit.

Hate to break this to you, but you're way overconfident in your analyses and you also have a severe anger issue, you may want to try therapy.

I don't have to deal with anything because I and the other dozen people posting against you are correct. Perhaps it is you who needs to accept facts and deal with it. You've been in the computer industry longer than I've been alive and you aren't that good at it.

RE: x86?
By Cheesew1z69 on 5/15/2013 9:41:02 PM , Rating: 2
You have some serious anger issues, Jesus Christ. I think you need to get off the computer and take a walk or something.

RE: x86?
By bsd228 on 5/15/2013 2:40:28 PM , Rating: 2
So you're both going to pretend that the convention isn't to use "x86" to refer to 32-bit stuff and "x64" to refer to 64-bit stuff?

It really isn't the convention you say it is. (and it's really stupid to be so angry about it)

x86 has long referred to processors (intel, amd, a few small players) that are capable of running the 8086-80686 instructions. In contrast, you have Power, ARM, Core, Sparc, etc.

Going into linux/unix land, i386 is often your designation for 32 bit, with x86_64 for 64bit.
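The linux/unix naming convention described here is easy to pin down: the machine strings reported by tools like `uname -m` fall into a few well-known spellings (the mapping below covers only the common ones, an assumption on my part):

```python
# Map common machine strings onto the "32-bit x86" vs "64-bit x86" naming.

def classify(machine):
    machine = machine.lower()
    if machine in {"i386", "i486", "i586", "i686"}:
        return "x86 (32-bit)"
    if machine in {"x86_64", "amd64", "x64"}:
        return "x86-64 (64-bit)"
    return "non-x86"

print(classify("i686"))     # x86 (32-bit)
print(classify("x86_64"))   # x86-64 (64-bit)
print(classify("armv7l"))   # non-x86
```

Note that "x86" as a family name covers both of the first two buckets, which is the whole disagreement in this thread in miniature.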

RE: x86?
By Cheesew1z69 on 5/15/2013 9:46:45 PM , Rating: 2
He actually acts like this about a lot of subjects on this site, that's very common for him to act this way.

RE: x86?
By inighthawki on 5/15/2013 9:58:21 PM , Rating: 2
Well of course, didn't you know that he is the gatekeeper of all knowledge, and is always right about everything, including areas he clearly has no expertise in?

RE: x86?
By Cheesew1z69 on 5/16/2013 12:07:52 AM , Rating: 1
I actually was thinking about how much of a know it all he thinks he is, also it's quite amusing he acts like a total douche when I posted what I did, then posts the same shit with what basically says the same thing I did...

RE: x86?
By Cheesew1z69 on 5/16/2013 1:15:26 AM , Rating: 1
Also, there are a few others just like that on this site...

RE: x86?
By Motoman on 5/16/2013 10:47:17 AM , Rating: 2
I have more expertise about computer hardware in my left pinky than either of you children will ever learn.

The insistence of toddlers that long-established norms like x86 vs. x64 to denote 32-bit vs. 64-bit somehow never existed does nothing but underscore the fact that you've not actually been involved in this industry...or at least, not since you graduated from kindergarten a few days ago.

It's an indisputable fact that the terms are exceedingly commonly used to differentiate between the entirety of either the 32-bit or 64-bit platforms. CPUs and software.

Two seconds on Google would verify that - if you children could be bothered to learn WTF you're talking about before you spout off.

RE: x86?
By Cheesew1z69 on 5/16/2013 11:08:25 AM , Rating: 2
illiteratehack writes "10 years ago AMD released its first Opteron processor, the first 64-bit x86 processor . The firm's 64-bit 'extensions' allowed the chip to run existing 32-bit x86 code in a bid to avoid the problems faced by Intel's Itanium processor. However AMD suffered from a lack of native 64-bit software support, with Microsoft's Windows XP 64-bit edition severely hampering its adoption in the workstation market." But it worked out in the end.
Let's see how Mr. Knowitall spins this one.

RE: x86?
By Motoman on 5/16/2013 11:37:31 AM , Rating: 2

So on and so forth. It's clear that normal people normally use x86 and x64 to differentiate between 32-bit and 64-bit.


It's over. The snippet you pasted up there says nothing to dispute that fact. You have one person acknowledging that AMD incorporated 64-bit extensions into the x86 framework, creating what was formally referred to as x86-64, but which going forward was differentiated from traditional x86 platform stuff by saying either x86 or x64.


Not to mention that it's rather ironic that you quote someone who calls themselves "illiteratehack."

RE: x86?
By karimtemple on 5/16/2013 11:46:34 AM , Rating: 2

RE: x86?
By Motoman on 5/16/2013 12:22:30 PM , Rating: 2
OK. Point of interest...maybe this actually is more of a "generational thing."

Informal poll of a dozen or so IT types that were milling around the IBM office where I happened to be working with an old colleague...

There were only a couple younger guys in the office - like, in their 20s. Everyone in their 30s and over stated that they intuitively think of x86 as indicative of 32-bit, and normally would use x64 to refer to 64-bit stuff - hardware or software.

The 2 guys in their 20s said they never use "x64" and just refer to any such hardware or software intended to be used on a Windows platform as "x86."

At which point the rest of us told them to stay off our lawns.

I think it probably has to do with them not really "living" through the transition of 64-bit extensions into the platform. Neither of them really has any recognition of Itanium either. We had to explain to them what the hell that even was, and why it ultimately failed. Well...they understood why it ultimately failed once we described it to them. It's just kind of amazing that they didn't know.

RE: x86?
By karimtemple on 5/16/2013 12:54:15 PM , Rating: 2
Nice try. 8/10.

(I'm 30 BTW)

No x86 hardware is 32-bit anymore. What are all these 32-bit parts that need their own nomenclature? "x64" is a software term. Microsoft uses it to distinguish between their 32- and 64-bit software. Even your links say that.

RE: x86?
By inighthawki on 5/16/2013 1:20:45 PM , Rating: 2
I'm not sure it is. I work with a number of people who have been in the industry for 20+ years and they use the term x86 to refer to anything in the x86 family, including x86-64.

I lived through the 64-bit transition and know fully well what Itanium is, but that doesn't change anything.

RE: x86?
By Cheesew1z69 on 5/16/2013 11:04:27 AM , Rating: 2
I have more expertise about computer hardware in my left pinky than either of you children will ever learn.
Keep dreaming... kid.

RE: x86?
By inighthawki on 5/16/2013 11:30:55 AM , Rating: 2
The insistence of toddlers that long-established norms like x86 vs. x64 to denote 32-bit vs. 64-bit somehow never existed

Another prime example of your delusion, since nobody ever said that.

What you ARE wrong about is the claim that "x86" is an uncommon term for describing any CPU from the x86 family, 32 OR 64 bit. Nobody claimed the term x64 doesn't exist or that it cannot be used to refer to a 64-bit x86-64 CPU.

RE: x86?
By Concillian on 5/15/2013 1:57:28 PM , Rating: 2
The article talks about 8GB memory standard and you think x86 means 32 bit?

x86 can also be used to describe the general architecture. You know like ARM, PowerPC and Cell architectures that other gaming systems have been designed around?

RE: x86?
By geekman1024 on 5/16/2013 1:08:20 AM , Rating: 1
Dude! 86 is larger than 64! That's why! Bigger is Better!

"If a man really wants to make a million dollars, the best way would be to start his own religion." -- Scientology founder L. Ron. Hubbard

Copyright 2016 DailyTech LLC.